Oct 07 20:51:08 localhost kernel: Linux version 5.14.0-620.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-11), GNU ld version 2.35.2-67.el9) #1 SMP PREEMPT_DYNAMIC Fri Sep 26 01:13:23 UTC 2025
Oct 07 20:51:08 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Oct 07 20:51:08 localhost kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-620.el9.x86_64 root=UUID=1631a6ad-43b8-436d-ae76-16fa14b94458 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Oct 07 20:51:08 localhost kernel: BIOS-provided physical RAM map:
Oct 07 20:51:08 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Oct 07 20:51:08 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Oct 07 20:51:08 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Oct 07 20:51:08 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Oct 07 20:51:08 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Oct 07 20:51:08 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Oct 07 20:51:08 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Oct 07 20:51:08 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Oct 07 20:51:08 localhost kernel: NX (Execute Disable) protection: active
Oct 07 20:51:08 localhost kernel: APIC: Static calls initialized
Oct 07 20:51:08 localhost kernel: SMBIOS 2.8 present.
Oct 07 20:51:08 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Oct 07 20:51:08 localhost kernel: Hypervisor detected: KVM
Oct 07 20:51:08 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Oct 07 20:51:08 localhost kernel: kvm-clock: using sched offset of 4038727541 cycles
Oct 07 20:51:08 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Oct 07 20:51:08 localhost kernel: tsc: Detected 2800.000 MHz processor
Oct 07 20:51:08 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Oct 07 20:51:08 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Oct 07 20:51:08 localhost kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Oct 07 20:51:08 localhost kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Oct 07 20:51:08 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Oct 07 20:51:08 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Oct 07 20:51:08 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Oct 07 20:51:08 localhost kernel: Using GB pages for direct mapping
Oct 07 20:51:08 localhost kernel: RAMDISK: [mem 0x2d7c4000-0x32bd9fff]
Oct 07 20:51:08 localhost kernel: ACPI: Early table checksum verification disabled
Oct 07 20:51:08 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Oct 07 20:51:08 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct 07 20:51:08 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct 07 20:51:08 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct 07 20:51:08 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Oct 07 20:51:08 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct 07 20:51:08 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct 07 20:51:08 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Oct 07 20:51:08 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Oct 07 20:51:08 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Oct 07 20:51:08 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Oct 07 20:51:08 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Oct 07 20:51:08 localhost kernel: No NUMA configuration found
Oct 07 20:51:08 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Oct 07 20:51:08 localhost kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Oct 07 20:51:08 localhost kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Oct 07 20:51:08 localhost kernel: Zone ranges:
Oct 07 20:51:08 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Oct 07 20:51:08 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Oct 07 20:51:08 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000023fffffff]
Oct 07 20:51:08 localhost kernel:   Device   empty
Oct 07 20:51:08 localhost kernel: Movable zone start for each node
Oct 07 20:51:08 localhost kernel: Early memory node ranges
Oct 07 20:51:08 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Oct 07 20:51:08 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Oct 07 20:51:08 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000023fffffff]
Oct 07 20:51:08 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Oct 07 20:51:08 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Oct 07 20:51:08 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Oct 07 20:51:08 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Oct 07 20:51:08 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Oct 07 20:51:08 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Oct 07 20:51:08 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Oct 07 20:51:08 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Oct 07 20:51:08 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Oct 07 20:51:08 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Oct 07 20:51:08 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Oct 07 20:51:08 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Oct 07 20:51:08 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Oct 07 20:51:08 localhost kernel: TSC deadline timer available
Oct 07 20:51:08 localhost kernel: CPU topo: Max. logical packages:   8
Oct 07 20:51:08 localhost kernel: CPU topo: Max. logical dies:       8
Oct 07 20:51:08 localhost kernel: CPU topo: Max. dies per package:   1
Oct 07 20:51:08 localhost kernel: CPU topo: Max. threads per core:   1
Oct 07 20:51:08 localhost kernel: CPU topo: Num. cores per package:     1
Oct 07 20:51:08 localhost kernel: CPU topo: Num. threads per package:   1
Oct 07 20:51:08 localhost kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Oct 07 20:51:08 localhost kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Oct 07 20:51:08 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Oct 07 20:51:08 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Oct 07 20:51:08 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Oct 07 20:51:08 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Oct 07 20:51:08 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Oct 07 20:51:08 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Oct 07 20:51:08 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Oct 07 20:51:08 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Oct 07 20:51:08 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Oct 07 20:51:08 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Oct 07 20:51:08 localhost kernel: Booting paravirtualized kernel on KVM
Oct 07 20:51:08 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Oct 07 20:51:08 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Oct 07 20:51:08 localhost kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Oct 07 20:51:08 localhost kernel: pcpu-alloc: s225280 r8192 d28672 u262144 alloc=1*2097152
Oct 07 20:51:08 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Oct 07 20:51:08 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Oct 07 20:51:08 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-620.el9.x86_64 root=UUID=1631a6ad-43b8-436d-ae76-16fa14b94458 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Oct 07 20:51:08 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-620.el9.x86_64", will be passed to user space.
Oct 07 20:51:08 localhost kernel: random: crng init done
Oct 07 20:51:08 localhost kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Oct 07 20:51:08 localhost kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Oct 07 20:51:08 localhost kernel: Fallback order for Node 0: 0 
Oct 07 20:51:08 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Oct 07 20:51:08 localhost kernel: Policy zone: Normal
Oct 07 20:51:08 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Oct 07 20:51:08 localhost kernel: software IO TLB: area num 8.
Oct 07 20:51:08 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Oct 07 20:51:08 localhost kernel: ftrace: allocating 49370 entries in 193 pages
Oct 07 20:51:08 localhost kernel: ftrace: allocated 193 pages with 3 groups
Oct 07 20:51:08 localhost kernel: Dynamic Preempt: voluntary
Oct 07 20:51:08 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Oct 07 20:51:08 localhost kernel: rcu:         RCU event tracing is enabled.
Oct 07 20:51:08 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Oct 07 20:51:08 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Oct 07 20:51:08 localhost kernel:         Rude variant of Tasks RCU enabled.
Oct 07 20:51:08 localhost kernel:         Tracing variant of Tasks RCU enabled.
Oct 07 20:51:08 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Oct 07 20:51:08 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Oct 07 20:51:08 localhost kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Oct 07 20:51:08 localhost kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Oct 07 20:51:08 localhost kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Oct 07 20:51:08 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Oct 07 20:51:08 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Oct 07 20:51:08 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Oct 07 20:51:08 localhost kernel: Console: colour VGA+ 80x25
Oct 07 20:51:08 localhost kernel: printk: console [ttyS0] enabled
Oct 07 20:51:08 localhost kernel: ACPI: Core revision 20230331
Oct 07 20:51:08 localhost kernel: APIC: Switch to symmetric I/O mode setup
Oct 07 20:51:08 localhost kernel: x2apic enabled
Oct 07 20:51:08 localhost kernel: APIC: Switched APIC routing to: physical x2apic
Oct 07 20:51:08 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Oct 07 20:51:08 localhost kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Oct 07 20:51:08 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Oct 07 20:51:08 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Oct 07 20:51:08 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Oct 07 20:51:08 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Oct 07 20:51:08 localhost kernel: Spectre V2 : Mitigation: Retpolines
Oct 07 20:51:08 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Oct 07 20:51:08 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Oct 07 20:51:08 localhost kernel: RETBleed: Mitigation: untrained return thunk
Oct 07 20:51:08 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Oct 07 20:51:08 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Oct 07 20:51:08 localhost kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Oct 07 20:51:08 localhost kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Oct 07 20:51:08 localhost kernel: x86/bugs: return thunk changed
Oct 07 20:51:08 localhost kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Oct 07 20:51:08 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Oct 07 20:51:08 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Oct 07 20:51:08 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Oct 07 20:51:08 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Oct 07 20:51:08 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Oct 07 20:51:08 localhost kernel: Freeing SMP alternatives memory: 40K
Oct 07 20:51:08 localhost kernel: pid_max: default: 32768 minimum: 301
Oct 07 20:51:08 localhost kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Oct 07 20:51:08 localhost kernel: landlock: Up and running.
Oct 07 20:51:08 localhost kernel: Yama: becoming mindful.
Oct 07 20:51:08 localhost kernel: SELinux:  Initializing.
Oct 07 20:51:08 localhost kernel: LSM support for eBPF active
Oct 07 20:51:08 localhost kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Oct 07 20:51:08 localhost kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Oct 07 20:51:08 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Oct 07 20:51:08 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Oct 07 20:51:08 localhost kernel: ... version:                0
Oct 07 20:51:08 localhost kernel: ... bit width:              48
Oct 07 20:51:08 localhost kernel: ... generic registers:      6
Oct 07 20:51:08 localhost kernel: ... value mask:             0000ffffffffffff
Oct 07 20:51:08 localhost kernel: ... max period:             00007fffffffffff
Oct 07 20:51:08 localhost kernel: ... fixed-purpose events:   0
Oct 07 20:51:08 localhost kernel: ... event mask:             000000000000003f
Oct 07 20:51:08 localhost kernel: signal: max sigframe size: 1776
Oct 07 20:51:08 localhost kernel: rcu: Hierarchical SRCU implementation.
Oct 07 20:51:08 localhost kernel: rcu:         Max phase no-delay instances is 400.
Oct 07 20:51:08 localhost kernel: smp: Bringing up secondary CPUs ...
Oct 07 20:51:08 localhost kernel: smpboot: x86: Booting SMP configuration:
Oct 07 20:51:08 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Oct 07 20:51:08 localhost kernel: smp: Brought up 1 node, 8 CPUs
Oct 07 20:51:08 localhost kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Oct 07 20:51:08 localhost kernel: node 0 deferred pages initialised in 24ms
Oct 07 20:51:08 localhost kernel: Memory: 7765660K/8388068K available (16384K kernel code, 5784K rwdata, 13996K rodata, 4068K init, 7304K bss, 616508K reserved, 0K cma-reserved)
Oct 07 20:51:08 localhost kernel: devtmpfs: initialized
Oct 07 20:51:08 localhost kernel: x86/mm: Memory block size: 128MB
Oct 07 20:51:08 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Oct 07 20:51:08 localhost kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Oct 07 20:51:08 localhost kernel: pinctrl core: initialized pinctrl subsystem
Oct 07 20:51:08 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Oct 07 20:51:08 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Oct 07 20:51:08 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Oct 07 20:51:08 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Oct 07 20:51:08 localhost kernel: audit: initializing netlink subsys (disabled)
Oct 07 20:51:08 localhost kernel: audit: type=2000 audit(1759870265.876:1): state=initialized audit_enabled=0 res=1
Oct 07 20:51:08 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Oct 07 20:51:08 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Oct 07 20:51:08 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Oct 07 20:51:08 localhost kernel: cpuidle: using governor menu
Oct 07 20:51:08 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Oct 07 20:51:08 localhost kernel: PCI: Using configuration type 1 for base access
Oct 07 20:51:08 localhost kernel: PCI: Using configuration type 1 for extended access
Oct 07 20:51:08 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Oct 07 20:51:08 localhost kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Oct 07 20:51:08 localhost kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Oct 07 20:51:08 localhost kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Oct 07 20:51:08 localhost kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Oct 07 20:51:08 localhost kernel: Demotion targets for Node 0: null
Oct 07 20:51:08 localhost kernel: cryptd: max_cpu_qlen set to 1000
Oct 07 20:51:08 localhost kernel: ACPI: Added _OSI(Module Device)
Oct 07 20:51:08 localhost kernel: ACPI: Added _OSI(Processor Device)
Oct 07 20:51:08 localhost kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Oct 07 20:51:08 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Oct 07 20:51:08 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Oct 07 20:51:08 localhost kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Oct 07 20:51:08 localhost kernel: ACPI: Interpreter enabled
Oct 07 20:51:08 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Oct 07 20:51:08 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Oct 07 20:51:08 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Oct 07 20:51:08 localhost kernel: PCI: Using E820 reservations for host bridge windows
Oct 07 20:51:08 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Oct 07 20:51:08 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Oct 07 20:51:08 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Oct 07 20:51:08 localhost kernel: acpiphp: Slot [3] registered
Oct 07 20:51:08 localhost kernel: acpiphp: Slot [4] registered
Oct 07 20:51:08 localhost kernel: acpiphp: Slot [5] registered
Oct 07 20:51:08 localhost kernel: acpiphp: Slot [6] registered
Oct 07 20:51:08 localhost kernel: acpiphp: Slot [7] registered
Oct 07 20:51:08 localhost kernel: acpiphp: Slot [8] registered
Oct 07 20:51:08 localhost kernel: acpiphp: Slot [9] registered
Oct 07 20:51:08 localhost kernel: acpiphp: Slot [10] registered
Oct 07 20:51:08 localhost kernel: acpiphp: Slot [11] registered
Oct 07 20:51:08 localhost kernel: acpiphp: Slot [12] registered
Oct 07 20:51:08 localhost kernel: acpiphp: Slot [13] registered
Oct 07 20:51:08 localhost kernel: acpiphp: Slot [14] registered
Oct 07 20:51:08 localhost kernel: acpiphp: Slot [15] registered
Oct 07 20:51:08 localhost kernel: acpiphp: Slot [16] registered
Oct 07 20:51:08 localhost kernel: acpiphp: Slot [17] registered
Oct 07 20:51:08 localhost kernel: acpiphp: Slot [18] registered
Oct 07 20:51:08 localhost kernel: acpiphp: Slot [19] registered
Oct 07 20:51:08 localhost kernel: acpiphp: Slot [20] registered
Oct 07 20:51:08 localhost kernel: acpiphp: Slot [21] registered
Oct 07 20:51:08 localhost kernel: acpiphp: Slot [22] registered
Oct 07 20:51:08 localhost kernel: acpiphp: Slot [23] registered
Oct 07 20:51:08 localhost kernel: acpiphp: Slot [24] registered
Oct 07 20:51:08 localhost kernel: acpiphp: Slot [25] registered
Oct 07 20:51:08 localhost kernel: acpiphp: Slot [26] registered
Oct 07 20:51:08 localhost kernel: acpiphp: Slot [27] registered
Oct 07 20:51:08 localhost kernel: acpiphp: Slot [28] registered
Oct 07 20:51:08 localhost kernel: acpiphp: Slot [29] registered
Oct 07 20:51:08 localhost kernel: acpiphp: Slot [30] registered
Oct 07 20:51:08 localhost kernel: acpiphp: Slot [31] registered
Oct 07 20:51:08 localhost kernel: PCI host bridge to bus 0000:00
Oct 07 20:51:08 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Oct 07 20:51:08 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Oct 07 20:51:08 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Oct 07 20:51:08 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Oct 07 20:51:08 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Oct 07 20:51:08 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Oct 07 20:51:08 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Oct 07 20:51:08 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Oct 07 20:51:08 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Oct 07 20:51:08 localhost kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Oct 07 20:51:08 localhost kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Oct 07 20:51:08 localhost kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Oct 07 20:51:08 localhost kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Oct 07 20:51:08 localhost kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Oct 07 20:51:08 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Oct 07 20:51:08 localhost kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Oct 07 20:51:08 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Oct 07 20:51:08 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Oct 07 20:51:08 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Oct 07 20:51:08 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Oct 07 20:51:08 localhost kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Oct 07 20:51:08 localhost kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Oct 07 20:51:08 localhost kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Oct 07 20:51:08 localhost kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Oct 07 20:51:08 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Oct 07 20:51:08 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Oct 07 20:51:08 localhost kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Oct 07 20:51:08 localhost kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Oct 07 20:51:08 localhost kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Oct 07 20:51:08 localhost kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Oct 07 20:51:08 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Oct 07 20:51:08 localhost kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Oct 07 20:51:08 localhost kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Oct 07 20:51:08 localhost kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Oct 07 20:51:08 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Oct 07 20:51:08 localhost kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Oct 07 20:51:08 localhost kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Oct 07 20:51:08 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Oct 07 20:51:08 localhost kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Oct 07 20:51:08 localhost kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Oct 07 20:51:08 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Oct 07 20:51:08 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Oct 07 20:51:08 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Oct 07 20:51:08 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Oct 07 20:51:08 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Oct 07 20:51:08 localhost kernel: iommu: Default domain type: Translated
Oct 07 20:51:08 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Oct 07 20:51:08 localhost kernel: SCSI subsystem initialized
Oct 07 20:51:08 localhost kernel: ACPI: bus type USB registered
Oct 07 20:51:08 localhost kernel: usbcore: registered new interface driver usbfs
Oct 07 20:51:08 localhost kernel: usbcore: registered new interface driver hub
Oct 07 20:51:08 localhost kernel: usbcore: registered new device driver usb
Oct 07 20:51:08 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Oct 07 20:51:08 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Oct 07 20:51:08 localhost kernel: PTP clock support registered
Oct 07 20:51:08 localhost kernel: EDAC MC: Ver: 3.0.0
Oct 07 20:51:08 localhost kernel: NetLabel: Initializing
Oct 07 20:51:08 localhost kernel: NetLabel:  domain hash size = 128
Oct 07 20:51:08 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Oct 07 20:51:08 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Oct 07 20:51:08 localhost kernel: PCI: Using ACPI for IRQ routing
Oct 07 20:51:08 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Oct 07 20:51:08 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Oct 07 20:51:08 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Oct 07 20:51:08 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Oct 07 20:51:08 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Oct 07 20:51:08 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Oct 07 20:51:08 localhost kernel: vgaarb: loaded
Oct 07 20:51:08 localhost kernel: clocksource: Switched to clocksource kvm-clock
Oct 07 20:51:08 localhost kernel: VFS: Disk quotas dquot_6.6.0
Oct 07 20:51:08 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Oct 07 20:51:08 localhost kernel: pnp: PnP ACPI init
Oct 07 20:51:08 localhost kernel: pnp 00:03: [dma 2]
Oct 07 20:51:08 localhost kernel: pnp: PnP ACPI: found 5 devices
Oct 07 20:51:08 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Oct 07 20:51:08 localhost kernel: NET: Registered PF_INET protocol family
Oct 07 20:51:08 localhost kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Oct 07 20:51:08 localhost kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Oct 07 20:51:08 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Oct 07 20:51:08 localhost kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Oct 07 20:51:08 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Oct 07 20:51:08 localhost kernel: TCP: Hash tables configured (established 65536 bind 65536)
Oct 07 20:51:08 localhost kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Oct 07 20:51:08 localhost kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Oct 07 20:51:08 localhost kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Oct 07 20:51:08 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Oct 07 20:51:08 localhost kernel: NET: Registered PF_XDP protocol family
Oct 07 20:51:08 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Oct 07 20:51:08 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Oct 07 20:51:08 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Oct 07 20:51:08 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Oct 07 20:51:08 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Oct 07 20:51:08 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Oct 07 20:51:08 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Oct 07 20:51:08 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Oct 07 20:51:08 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x140 took 71840 usecs
Oct 07 20:51:08 localhost kernel: PCI: CLS 0 bytes, default 64
Oct 07 20:51:08 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Oct 07 20:51:08 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Oct 07 20:51:08 localhost kernel: ACPI: bus type thunderbolt registered
Oct 07 20:51:08 localhost kernel: Trying to unpack rootfs image as initramfs...
Oct 07 20:51:08 localhost kernel: Initialise system trusted keyrings
Oct 07 20:51:08 localhost kernel: Key type blacklist registered
Oct 07 20:51:08 localhost kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Oct 07 20:51:08 localhost kernel: zbud: loaded
Oct 07 20:51:08 localhost kernel: integrity: Platform Keyring initialized
Oct 07 20:51:08 localhost kernel: integrity: Machine keyring initialized
Oct 07 20:51:08 localhost kernel: Freeing initrd memory: 86104K
Oct 07 20:51:08 localhost kernel: NET: Registered PF_ALG protocol family
Oct 07 20:51:08 localhost kernel: xor: automatically using best checksumming function   avx       
Oct 07 20:51:08 localhost kernel: Key type asymmetric registered
Oct 07 20:51:08 localhost kernel: Asymmetric key parser 'x509' registered
Oct 07 20:51:08 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Oct 07 20:51:08 localhost kernel: io scheduler mq-deadline registered
Oct 07 20:51:08 localhost kernel: io scheduler kyber registered
Oct 07 20:51:08 localhost kernel: io scheduler bfq registered
Oct 07 20:51:08 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Oct 07 20:51:08 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Oct 07 20:51:08 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Oct 07 20:51:08 localhost kernel: ACPI: button: Power Button [PWRF]
Oct 07 20:51:08 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Oct 07 20:51:08 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Oct 07 20:51:08 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Oct 07 20:51:08 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Oct 07 20:51:08 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Oct 07 20:51:08 localhost kernel: Non-volatile memory driver v1.3
Oct 07 20:51:08 localhost kernel: rdac: device handler registered
Oct 07 20:51:08 localhost kernel: hp_sw: device handler registered
Oct 07 20:51:08 localhost kernel: emc: device handler registered
Oct 07 20:51:08 localhost kernel: alua: device handler registered
Oct 07 20:51:08 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Oct 07 20:51:08 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Oct 07 20:51:08 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Oct 07 20:51:08 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Oct 07 20:51:08 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Oct 07 20:51:08 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Oct 07 20:51:08 localhost kernel: usb usb1: Product: UHCI Host Controller
Oct 07 20:51:08 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-620.el9.x86_64 uhci_hcd
Oct 07 20:51:08 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Oct 07 20:51:08 localhost kernel: hub 1-0:1.0: USB hub found
Oct 07 20:51:08 localhost kernel: hub 1-0:1.0: 2 ports detected
Oct 07 20:51:08 localhost kernel: usbcore: registered new interface driver usbserial_generic
Oct 07 20:51:08 localhost kernel: usbserial: USB Serial support registered for generic
Oct 07 20:51:08 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Oct 07 20:51:08 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Oct 07 20:51:08 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Oct 07 20:51:08 localhost kernel: mousedev: PS/2 mouse device common for all mice
Oct 07 20:51:08 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Oct 07 20:51:08 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Oct 07 20:51:08 localhost kernel: rtc_cmos 00:04: registered as rtc0
Oct 07 20:51:08 localhost kernel: rtc_cmos 00:04: setting system clock to 2025-10-07T20:51:07 UTC (1759870267)
Oct 07 20:51:08 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Oct 07 20:51:08 localhost kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Oct 07 20:51:08 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Oct 07 20:51:08 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Oct 07 20:51:08 localhost kernel: usbcore: registered new interface driver usbhid
Oct 07 20:51:08 localhost kernel: usbhid: USB HID core driver
Oct 07 20:51:08 localhost kernel: drop_monitor: Initializing network drop monitor service
Oct 07 20:51:08 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Oct 07 20:51:08 localhost kernel: Initializing XFRM netlink socket
Oct 07 20:51:08 localhost kernel: NET: Registered PF_INET6 protocol family
Oct 07 20:51:08 localhost kernel: Segment Routing with IPv6
Oct 07 20:51:08 localhost kernel: NET: Registered PF_PACKET protocol family
Oct 07 20:51:08 localhost kernel: mpls_gso: MPLS GSO support
Oct 07 20:51:08 localhost kernel: IPI shorthand broadcast: enabled
Oct 07 20:51:08 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Oct 07 20:51:08 localhost kernel: AES CTR mode by8 optimization enabled
Oct 07 20:51:08 localhost kernel: sched_clock: Marking stable (1185004180, 141614100)->(1434786950, -108168670)
Oct 07 20:51:08 localhost kernel: registered taskstats version 1
Oct 07 20:51:08 localhost kernel: Loading compiled-in X.509 certificates
Oct 07 20:51:08 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 4ff821c4997fbb659836adb05f5bc400c914e148'
Oct 07 20:51:08 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Oct 07 20:51:08 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Oct 07 20:51:08 localhost kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Oct 07 20:51:08 localhost kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Oct 07 20:51:08 localhost kernel: Demotion targets for Node 0: null
Oct 07 20:51:08 localhost kernel: page_owner is disabled
Oct 07 20:51:08 localhost kernel: Key type .fscrypt registered
Oct 07 20:51:08 localhost kernel: Key type fscrypt-provisioning registered
Oct 07 20:51:08 localhost kernel: Key type big_key registered
Oct 07 20:51:08 localhost kernel: Key type encrypted registered
Oct 07 20:51:08 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Oct 07 20:51:08 localhost kernel: Loading compiled-in module X.509 certificates
Oct 07 20:51:08 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 4ff821c4997fbb659836adb05f5bc400c914e148'
Oct 07 20:51:08 localhost kernel: ima: Allocated hash algorithm: sha256
Oct 07 20:51:08 localhost kernel: ima: No architecture policies found
Oct 07 20:51:08 localhost kernel: evm: Initialising EVM extended attributes:
Oct 07 20:51:08 localhost kernel: evm: security.selinux
Oct 07 20:51:08 localhost kernel: evm: security.SMACK64 (disabled)
Oct 07 20:51:08 localhost kernel: evm: security.SMACK64EXEC (disabled)
Oct 07 20:51:08 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Oct 07 20:51:08 localhost kernel: evm: security.SMACK64MMAP (disabled)
Oct 07 20:51:08 localhost kernel: evm: security.apparmor (disabled)
Oct 07 20:51:08 localhost kernel: evm: security.ima
Oct 07 20:51:08 localhost kernel: evm: security.capability
Oct 07 20:51:08 localhost kernel: evm: HMAC attrs: 0x1
Oct 07 20:51:08 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Oct 07 20:51:08 localhost kernel: Running certificate verification RSA selftest
Oct 07 20:51:08 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Oct 07 20:51:08 localhost kernel: Running certificate verification ECDSA selftest
Oct 07 20:51:08 localhost kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Oct 07 20:51:08 localhost kernel: clk: Disabling unused clocks
Oct 07 20:51:08 localhost kernel: Freeing unused decrypted memory: 2028K
Oct 07 20:51:08 localhost kernel: Freeing unused kernel image (initmem) memory: 4068K
Oct 07 20:51:08 localhost kernel: Write protecting the kernel read-only data: 30720k
Oct 07 20:51:08 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 340K
Oct 07 20:51:08 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Oct 07 20:51:08 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Oct 07 20:51:08 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Oct 07 20:51:08 localhost kernel: usb 1-1: Manufacturer: QEMU
Oct 07 20:51:08 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Oct 07 20:51:08 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Oct 07 20:51:08 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Oct 07 20:51:08 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Oct 07 20:51:08 localhost kernel: Run /init as init process
Oct 07 20:51:08 localhost kernel:   with arguments:
Oct 07 20:51:08 localhost kernel:     /init
Oct 07 20:51:08 localhost kernel:   with environment:
Oct 07 20:51:08 localhost kernel:     HOME=/
Oct 07 20:51:08 localhost kernel:     TERM=linux
Oct 07 20:51:08 localhost kernel:     BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-620.el9.x86_64
Oct 07 20:51:08 localhost systemd[1]: systemd 252-55.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Oct 07 20:51:08 localhost systemd[1]: Detected virtualization kvm.
Oct 07 20:51:08 localhost systemd[1]: Detected architecture x86-64.
Oct 07 20:51:08 localhost systemd[1]: Running in initrd.
Oct 07 20:51:08 localhost systemd[1]: No hostname configured, using default hostname.
Oct 07 20:51:08 localhost systemd[1]: Hostname set to <localhost>.
Oct 07 20:51:08 localhost systemd[1]: Initializing machine ID from VM UUID.
Oct 07 20:51:08 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Oct 07 20:51:08 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Oct 07 20:51:08 localhost systemd[1]: Reached target Local Encrypted Volumes.
Oct 07 20:51:08 localhost systemd[1]: Reached target Initrd /usr File System.
Oct 07 20:51:08 localhost systemd[1]: Reached target Local File Systems.
Oct 07 20:51:08 localhost systemd[1]: Reached target Path Units.
Oct 07 20:51:08 localhost systemd[1]: Reached target Slice Units.
Oct 07 20:51:08 localhost systemd[1]: Reached target Swaps.
Oct 07 20:51:08 localhost systemd[1]: Reached target Timer Units.
Oct 07 20:51:08 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Oct 07 20:51:08 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Oct 07 20:51:08 localhost systemd[1]: Listening on Journal Socket.
Oct 07 20:51:08 localhost systemd[1]: Listening on udev Control Socket.
Oct 07 20:51:08 localhost systemd[1]: Listening on udev Kernel Socket.
Oct 07 20:51:08 localhost systemd[1]: Reached target Socket Units.
Oct 07 20:51:08 localhost systemd[1]: Starting Create List of Static Device Nodes...
Oct 07 20:51:08 localhost systemd[1]: Starting Journal Service...
Oct 07 20:51:08 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Oct 07 20:51:08 localhost systemd[1]: Starting Apply Kernel Variables...
Oct 07 20:51:08 localhost systemd[1]: Starting Create System Users...
Oct 07 20:51:08 localhost systemd[1]: Starting Setup Virtual Console...
Oct 07 20:51:08 localhost systemd[1]: Finished Create List of Static Device Nodes.
Oct 07 20:51:08 localhost systemd[1]: Finished Apply Kernel Variables.
Oct 07 20:51:08 localhost systemd[1]: Finished Create System Users.
Oct 07 20:51:08 localhost systemd-journald[306]: Journal started
Oct 07 20:51:08 localhost systemd-journald[306]: Runtime Journal (/run/log/journal/bec7e4d59195425b8d0952635eb49950) is 8.0M, max 153.5M, 145.5M free.
Oct 07 20:51:08 localhost systemd-sysusers[311]: Creating group 'users' with GID 100.
Oct 07 20:51:08 localhost systemd-sysusers[311]: Creating group 'dbus' with GID 81.
Oct 07 20:51:08 localhost systemd-sysusers[311]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Oct 07 20:51:08 localhost systemd[1]: Started Journal Service.
Oct 07 20:51:08 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Oct 07 20:51:08 localhost systemd[1]: Starting Create Volatile Files and Directories...
Oct 07 20:51:08 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Oct 07 20:51:08 localhost systemd[1]: Finished Setup Virtual Console.
Oct 07 20:51:08 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Oct 07 20:51:08 localhost systemd[1]: Starting dracut cmdline hook...
Oct 07 20:51:08 localhost systemd[1]: Finished Create Volatile Files and Directories.
Oct 07 20:51:08 localhost dracut-cmdline[327]: dracut-9 dracut-057-102.git20250818.el9
Oct 07 20:51:08 localhost dracut-cmdline[327]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-620.el9.x86_64 root=UUID=1631a6ad-43b8-436d-ae76-16fa14b94458 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Oct 07 20:51:08 localhost systemd[1]: Finished dracut cmdline hook.
Oct 07 20:51:08 localhost systemd[1]: Starting dracut pre-udev hook...
Oct 07 20:51:08 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Oct 07 20:51:08 localhost kernel: device-mapper: uevent: version 1.0.3
Oct 07 20:51:08 localhost kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Oct 07 20:51:08 localhost kernel: RPC: Registered named UNIX socket transport module.
Oct 07 20:51:08 localhost kernel: RPC: Registered udp transport module.
Oct 07 20:51:08 localhost kernel: RPC: Registered tcp transport module.
Oct 07 20:51:08 localhost kernel: RPC: Registered tcp-with-tls transport module.
Oct 07 20:51:08 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Oct 07 20:51:08 localhost rpc.statd[445]: Version 2.5.4 starting
Oct 07 20:51:08 localhost rpc.statd[445]: Initializing NSM state
Oct 07 20:51:08 localhost rpc.idmapd[450]: Setting log level to 0
Oct 07 20:51:08 localhost systemd[1]: Finished dracut pre-udev hook.
Oct 07 20:51:09 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Oct 07 20:51:09 localhost systemd-udevd[463]: Using default interface naming scheme 'rhel-9.0'.
Oct 07 20:51:09 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Oct 07 20:51:09 localhost systemd[1]: Starting dracut pre-trigger hook...
Oct 07 20:51:09 localhost systemd[1]: Finished dracut pre-trigger hook.
Oct 07 20:51:09 localhost systemd[1]: Starting Coldplug All udev Devices...
Oct 07 20:51:09 localhost systemd[1]: Created slice Slice /system/modprobe.
Oct 07 20:51:09 localhost systemd[1]: Starting Load Kernel Module configfs...
Oct 07 20:51:09 localhost systemd[1]: Finished Coldplug All udev Devices.
Oct 07 20:51:09 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Oct 07 20:51:09 localhost systemd[1]: Finished Load Kernel Module configfs.
Oct 07 20:51:09 localhost systemd[1]: Mounting Kernel Configuration File System...
Oct 07 20:51:09 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Oct 07 20:51:09 localhost systemd[1]: Reached target Network.
Oct 07 20:51:09 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Oct 07 20:51:09 localhost systemd[1]: Starting dracut initqueue hook...
Oct 07 20:51:09 localhost systemd[1]: Mounted Kernel Configuration File System.
Oct 07 20:51:09 localhost kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Oct 07 20:51:09 localhost systemd[1]: Reached target System Initialization.
Oct 07 20:51:09 localhost systemd[1]: Reached target Basic System.
Oct 07 20:51:09 localhost kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Oct 07 20:51:09 localhost kernel:  vda: vda1
Oct 07 20:51:09 localhost kernel: libata version 3.00 loaded.
Oct 07 20:51:09 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Oct 07 20:51:09 localhost kernel: scsi host0: ata_piix
Oct 07 20:51:09 localhost kernel: scsi host1: ata_piix
Oct 07 20:51:09 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Oct 07 20:51:09 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Oct 07 20:51:09 localhost systemd[1]: Found device /dev/disk/by-uuid/1631a6ad-43b8-436d-ae76-16fa14b94458.
Oct 07 20:51:09 localhost systemd[1]: Reached target Initrd Root Device.
Oct 07 20:51:09 localhost kernel: ata1: found unknown device (class 0)
Oct 07 20:51:09 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Oct 07 20:51:09 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Oct 07 20:51:09 localhost systemd-udevd[482]: Network interface NamePolicy= disabled on kernel command line.
Oct 07 20:51:09 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Oct 07 20:51:09 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Oct 07 20:51:09 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Oct 07 20:51:09 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Oct 07 20:51:09 localhost systemd[1]: Finished dracut initqueue hook.
Oct 07 20:51:09 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Oct 07 20:51:09 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Oct 07 20:51:09 localhost systemd[1]: Reached target Remote File Systems.
Oct 07 20:51:09 localhost systemd[1]: Starting dracut pre-mount hook...
Oct 07 20:51:09 localhost systemd[1]: Finished dracut pre-mount hook.
Oct 07 20:51:09 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/1631a6ad-43b8-436d-ae76-16fa14b94458...
Oct 07 20:51:09 localhost systemd-fsck[556]: /usr/sbin/fsck.xfs: XFS file system.
Oct 07 20:51:09 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/1631a6ad-43b8-436d-ae76-16fa14b94458.
Oct 07 20:51:09 localhost systemd[1]: Mounting /sysroot...
Oct 07 20:51:10 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Oct 07 20:51:10 localhost kernel: XFS (vda1): Mounting V5 Filesystem 1631a6ad-43b8-436d-ae76-16fa14b94458
Oct 07 20:51:10 localhost kernel: XFS (vda1): Ending clean mount
Oct 07 20:51:10 localhost systemd[1]: Mounted /sysroot.
Oct 07 20:51:10 localhost systemd[1]: Reached target Initrd Root File System.
Oct 07 20:51:10 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Oct 07 20:51:10 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Oct 07 20:51:10 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Oct 07 20:51:10 localhost systemd[1]: Reached target Initrd File Systems.
Oct 07 20:51:10 localhost systemd[1]: Reached target Initrd Default Target.
Oct 07 20:51:10 localhost systemd[1]: Starting dracut mount hook...
Oct 07 20:51:10 localhost systemd[1]: Finished dracut mount hook.
Oct 07 20:51:10 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Oct 07 20:51:10 localhost rpc.idmapd[450]: exiting on signal 15
Oct 07 20:51:10 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Oct 07 20:51:10 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Oct 07 20:51:10 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Oct 07 20:51:10 localhost systemd[1]: Stopped target Network.
Oct 07 20:51:10 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Oct 07 20:51:10 localhost systemd[1]: Stopped target Timer Units.
Oct 07 20:51:10 localhost systemd[1]: dbus.socket: Deactivated successfully.
Oct 07 20:51:10 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Oct 07 20:51:10 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Oct 07 20:51:10 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Oct 07 20:51:10 localhost systemd[1]: Stopped target Initrd Default Target.
Oct 07 20:51:10 localhost systemd[1]: Stopped target Basic System.
Oct 07 20:51:10 localhost systemd[1]: Stopped target Initrd Root Device.
Oct 07 20:51:10 localhost systemd[1]: Stopped target Initrd /usr File System.
Oct 07 20:51:10 localhost systemd[1]: Stopped target Path Units.
Oct 07 20:51:10 localhost systemd[1]: Stopped target Remote File Systems.
Oct 07 20:51:10 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Oct 07 20:51:10 localhost systemd[1]: Stopped target Slice Units.
Oct 07 20:51:10 localhost systemd[1]: Stopped target Socket Units.
Oct 07 20:51:10 localhost systemd[1]: Stopped target System Initialization.
Oct 07 20:51:10 localhost systemd[1]: Stopped target Local File Systems.
Oct 07 20:51:10 localhost systemd[1]: Stopped target Swaps.
Oct 07 20:51:10 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Oct 07 20:51:10 localhost systemd[1]: Stopped dracut mount hook.
Oct 07 20:51:10 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Oct 07 20:51:10 localhost systemd[1]: Stopped dracut pre-mount hook.
Oct 07 20:51:10 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Oct 07 20:51:10 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Oct 07 20:51:10 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Oct 07 20:51:10 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Oct 07 20:51:10 localhost systemd[1]: Stopped dracut initqueue hook.
Oct 07 20:51:10 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Oct 07 20:51:10 localhost systemd[1]: Stopped Apply Kernel Variables.
Oct 07 20:51:10 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Oct 07 20:51:10 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Oct 07 20:51:10 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Oct 07 20:51:10 localhost systemd[1]: Stopped Coldplug All udev Devices.
Oct 07 20:51:10 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Oct 07 20:51:10 localhost systemd[1]: Stopped dracut pre-trigger hook.
Oct 07 20:51:10 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Oct 07 20:51:10 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Oct 07 20:51:10 localhost systemd[1]: Stopped Setup Virtual Console.
Oct 07 20:51:10 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Oct 07 20:51:10 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Oct 07 20:51:10 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Oct 07 20:51:10 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Oct 07 20:51:10 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Oct 07 20:51:10 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Oct 07 20:51:10 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Oct 07 20:51:10 localhost systemd[1]: Closed udev Control Socket.
Oct 07 20:51:10 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Oct 07 20:51:10 localhost systemd[1]: Closed udev Kernel Socket.
Oct 07 20:51:10 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Oct 07 20:51:10 localhost systemd[1]: Stopped dracut pre-udev hook.
Oct 07 20:51:10 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Oct 07 20:51:10 localhost systemd[1]: Stopped dracut cmdline hook.
Oct 07 20:51:10 localhost systemd[1]: Starting Cleanup udev Database...
Oct 07 20:51:10 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Oct 07 20:51:10 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Oct 07 20:51:10 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Oct 07 20:51:10 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Oct 07 20:51:10 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Oct 07 20:51:10 localhost systemd[1]: Stopped Create System Users.
Oct 07 20:51:10 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Oct 07 20:51:10 localhost systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Oct 07 20:51:10 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Oct 07 20:51:10 localhost systemd[1]: Finished Cleanup udev Database.
Oct 07 20:51:10 localhost systemd[1]: Reached target Switch Root.
Oct 07 20:51:10 localhost systemd[1]: Starting Switch Root...
Oct 07 20:51:10 localhost systemd[1]: Switching root.
Oct 07 20:51:10 localhost systemd-journald[306]: Journal stopped
Oct 07 20:51:11 localhost systemd-journald[306]: Received SIGTERM from PID 1 (systemd).
Oct 07 20:51:11 localhost kernel: audit: type=1404 audit(1759870270.801:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Oct 07 20:51:11 localhost kernel: SELinux:  policy capability network_peer_controls=1
Oct 07 20:51:11 localhost kernel: SELinux:  policy capability open_perms=1
Oct 07 20:51:11 localhost kernel: SELinux:  policy capability extended_socket_class=1
Oct 07 20:51:11 localhost kernel: SELinux:  policy capability always_check_network=0
Oct 07 20:51:11 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 07 20:51:11 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 07 20:51:11 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 07 20:51:11 localhost kernel: audit: type=1403 audit(1759870271.000:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Oct 07 20:51:11 localhost systemd[1]: Successfully loaded SELinux policy in 205.387ms.
Oct 07 20:51:11 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 39.699ms.
Oct 07 20:51:11 localhost systemd[1]: systemd 252-55.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Oct 07 20:51:11 localhost systemd[1]: Detected virtualization kvm.
Oct 07 20:51:11 localhost systemd[1]: Detected architecture x86-64.
Oct 07 20:51:11 localhost systemd-rc-local-generator[641]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 20:51:11 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Oct 07 20:51:11 localhost systemd[1]: Stopped Switch Root.
Oct 07 20:51:11 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Oct 07 20:51:11 localhost systemd[1]: Created slice Slice /system/getty.
Oct 07 20:51:11 localhost systemd[1]: Created slice Slice /system/serial-getty.
Oct 07 20:51:11 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Oct 07 20:51:11 localhost systemd[1]: Created slice User and Session Slice.
Oct 07 20:51:11 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Oct 07 20:51:11 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Oct 07 20:51:11 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Oct 07 20:51:11 localhost systemd[1]: Reached target Local Encrypted Volumes.
Oct 07 20:51:11 localhost systemd[1]: Stopped target Switch Root.
Oct 07 20:51:11 localhost systemd[1]: Stopped target Initrd File Systems.
Oct 07 20:51:11 localhost systemd[1]: Stopped target Initrd Root File System.
Oct 07 20:51:11 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Oct 07 20:51:11 localhost systemd[1]: Reached target Path Units.
Oct 07 20:51:11 localhost systemd[1]: Reached target rpc_pipefs.target.
Oct 07 20:51:11 localhost systemd[1]: Reached target Slice Units.
Oct 07 20:51:11 localhost systemd[1]: Reached target Swaps.
Oct 07 20:51:11 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Oct 07 20:51:11 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Oct 07 20:51:11 localhost systemd[1]: Reached target RPC Port Mapper.
Oct 07 20:51:11 localhost systemd[1]: Listening on Process Core Dump Socket.
Oct 07 20:51:11 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Oct 07 20:51:11 localhost systemd[1]: Listening on udev Control Socket.
Oct 07 20:51:11 localhost systemd[1]: Listening on udev Kernel Socket.
Oct 07 20:51:11 localhost systemd[1]: Mounting Huge Pages File System...
Oct 07 20:51:11 localhost systemd[1]: Mounting POSIX Message Queue File System...
Oct 07 20:51:11 localhost systemd[1]: Mounting Kernel Debug File System...
Oct 07 20:51:11 localhost systemd[1]: Mounting Kernel Trace File System...
Oct 07 20:51:11 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Oct 07 20:51:11 localhost systemd[1]: Starting Create List of Static Device Nodes...
Oct 07 20:51:11 localhost systemd[1]: Starting Load Kernel Module configfs...
Oct 07 20:51:11 localhost systemd[1]: Starting Load Kernel Module drm...
Oct 07 20:51:11 localhost systemd[1]: Starting Load Kernel Module efi_pstore...
Oct 07 20:51:11 localhost systemd[1]: Starting Load Kernel Module fuse...
Oct 07 20:51:11 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Oct 07 20:51:11 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Oct 07 20:51:11 localhost systemd[1]: Stopped File System Check on Root Device.
Oct 07 20:51:11 localhost systemd[1]: Stopped Journal Service.
Oct 07 20:51:11 localhost systemd[1]: Starting Journal Service...
Oct 07 20:51:11 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Oct 07 20:51:11 localhost systemd[1]: Starting Generate network units from Kernel command line...
Oct 07 20:51:11 localhost systemd[1]: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Oct 07 20:51:11 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Oct 07 20:51:11 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Oct 07 20:51:11 localhost systemd[1]: Starting Apply Kernel Variables...
Oct 07 20:51:11 localhost systemd[1]: Starting Coldplug All udev Devices...
Oct 07 20:51:11 localhost kernel: fuse: init (API version 7.37)
Oct 07 20:51:11 localhost systemd[1]: Mounted Huge Pages File System.
Oct 07 20:51:11 localhost systemd[1]: Mounted POSIX Message Queue File System.
Oct 07 20:51:11 localhost systemd[1]: Mounted Kernel Debug File System.
Oct 07 20:51:11 localhost systemd[1]: Mounted Kernel Trace File System.
Oct 07 20:51:11 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Oct 07 20:51:11 localhost systemd[1]: Finished Create List of Static Device Nodes.
Oct 07 20:51:11 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Oct 07 20:51:11 localhost systemd[1]: Finished Load Kernel Module configfs.
Oct 07 20:51:11 localhost systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Oct 07 20:51:11 localhost systemd[1]: Finished Load Kernel Module efi_pstore.
Oct 07 20:51:11 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Oct 07 20:51:11 localhost systemd[1]: Finished Load Kernel Module fuse.
Oct 07 20:51:11 localhost systemd-journald[682]: Journal started
Oct 07 20:51:11 localhost systemd-journald[682]: Runtime Journal (/run/log/journal/42833e1b511a402df82cb9cb2fc36491) is 8.0M, max 153.5M, 145.5M free.
Oct 07 20:51:11 localhost systemd[1]: Queued start job for default target Multi-User System.
Oct 07 20:51:11 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Oct 07 20:51:11 localhost systemd[1]: Started Journal Service.
Oct 07 20:51:11 localhost kernel: ACPI: bus type drm_connector registered
Oct 07 20:51:11 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Oct 07 20:51:11 localhost systemd[1]: Finished Generate network units from Kernel command line.
Oct 07 20:51:11 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Oct 07 20:51:11 localhost systemd[1]: Finished Load Kernel Module drm.
Oct 07 20:51:11 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Oct 07 20:51:11 localhost systemd[1]: Finished Apply Kernel Variables.
Oct 07 20:51:11 localhost systemd[1]: Mounting FUSE Control File System...
Oct 07 20:51:11 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Oct 07 20:51:11 localhost systemd[1]: Starting Rebuild Hardware Database...
Oct 07 20:51:11 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Oct 07 20:51:11 localhost systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Oct 07 20:51:11 localhost systemd[1]: Starting Load/Save OS Random Seed...
Oct 07 20:51:11 localhost systemd[1]: Starting Create System Users...
Oct 07 20:51:11 localhost systemd[1]: Mounted FUSE Control File System.
Oct 07 20:51:11 localhost systemd-journald[682]: Runtime Journal (/run/log/journal/42833e1b511a402df82cb9cb2fc36491) is 8.0M, max 153.5M, 145.5M free.
Oct 07 20:51:11 localhost systemd-journald[682]: Received client request to flush runtime journal.
Oct 07 20:51:11 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Oct 07 20:51:11 localhost systemd[1]: Finished Load/Save OS Random Seed.
Oct 07 20:51:11 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Oct 07 20:51:11 localhost systemd[1]: Finished Coldplug All udev Devices.
Oct 07 20:51:11 localhost systemd[1]: Finished Create System Users.
Oct 07 20:51:11 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Oct 07 20:51:11 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Oct 07 20:51:11 localhost systemd[1]: Reached target Preparation for Local File Systems.
Oct 07 20:51:11 localhost systemd[1]: Reached target Local File Systems.
Oct 07 20:51:11 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Oct 07 20:51:11 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Oct 07 20:51:11 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Oct 07 20:51:11 localhost systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Oct 07 20:51:11 localhost systemd[1]: Starting Automatic Boot Loader Update...
Oct 07 20:51:11 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Oct 07 20:51:11 localhost systemd[1]: Starting Create Volatile Files and Directories...
Oct 07 20:51:11 localhost bootctl[699]: Couldn't find EFI system partition, skipping.
Oct 07 20:51:11 localhost systemd[1]: Finished Automatic Boot Loader Update.
Oct 07 20:51:12 localhost systemd[1]: Finished Create Volatile Files and Directories.
Oct 07 20:51:12 localhost systemd[1]: Starting Security Auditing Service...
Oct 07 20:51:12 localhost systemd[1]: Starting RPC Bind...
Oct 07 20:51:12 localhost systemd[1]: Starting Rebuild Journal Catalog...
Oct 07 20:51:12 localhost auditd[705]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Oct 07 20:51:12 localhost auditd[705]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Oct 07 20:51:12 localhost systemd[1]: Finished Rebuild Journal Catalog.
Oct 07 20:51:12 localhost systemd[1]: Started RPC Bind.
Oct 07 20:51:12 localhost augenrules[710]: /sbin/augenrules: No change
Oct 07 20:51:12 localhost augenrules[725]: No rules
Oct 07 20:51:12 localhost augenrules[725]: enabled 1
Oct 07 20:51:12 localhost augenrules[725]: failure 1
Oct 07 20:51:12 localhost augenrules[725]: pid 705
Oct 07 20:51:12 localhost augenrules[725]: rate_limit 0
Oct 07 20:51:12 localhost augenrules[725]: backlog_limit 8192
Oct 07 20:51:12 localhost augenrules[725]: lost 0
Oct 07 20:51:12 localhost augenrules[725]: backlog 0
Oct 07 20:51:12 localhost augenrules[725]: backlog_wait_time 60000
Oct 07 20:51:12 localhost augenrules[725]: backlog_wait_time_actual 0
Oct 07 20:51:12 localhost systemd[1]: Started Security Auditing Service.
Oct 07 20:51:12 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Oct 07 20:51:12 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Oct 07 20:51:12 localhost systemd[1]: Finished Rebuild Hardware Database.
Oct 07 20:51:12 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Oct 07 20:51:12 localhost systemd-udevd[733]: Using default interface naming scheme 'rhel-9.0'.
Oct 07 20:51:12 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Oct 07 20:51:12 localhost systemd[1]: Starting Update is Completed...
Oct 07 20:51:12 localhost systemd[1]: Finished Update is Completed.
Oct 07 20:51:12 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Oct 07 20:51:12 localhost systemd[1]: Reached target System Initialization.
Oct 07 20:51:12 localhost systemd[1]: Started dnf makecache --timer.
Oct 07 20:51:12 localhost systemd[1]: Started Daily rotation of log files.
Oct 07 20:51:12 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Oct 07 20:51:12 localhost systemd[1]: Reached target Timer Units.
Oct 07 20:51:12 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Oct 07 20:51:12 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Oct 07 20:51:12 localhost systemd[1]: Reached target Socket Units.
Oct 07 20:51:12 localhost systemd[1]: Starting D-Bus System Message Bus...
Oct 07 20:51:12 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Oct 07 20:51:12 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Oct 07 20:51:12 localhost systemd[1]: Starting Load Kernel Module configfs...
Oct 07 20:51:12 localhost systemd-udevd[737]: Network interface NamePolicy= disabled on kernel command line.
Oct 07 20:51:12 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Oct 07 20:51:12 localhost systemd[1]: Finished Load Kernel Module configfs.
Oct 07 20:51:12 localhost systemd[1]: Started D-Bus System Message Bus.
Oct 07 20:51:12 localhost systemd[1]: Reached target Basic System.
Oct 07 20:51:12 localhost dbus-broker-lau[763]: Ready
Oct 07 20:51:12 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Oct 07 20:51:12 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Oct 07 20:51:12 localhost systemd[1]: Starting NTP client/server...
Oct 07 20:51:12 localhost kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Oct 07 20:51:12 localhost kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Oct 07 20:51:12 localhost systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Oct 07 20:51:12 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Oct 07 20:51:12 localhost chronyd[788]: chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Oct 07 20:51:12 localhost systemd[1]: Starting IPv4 firewall with iptables...
Oct 07 20:51:12 localhost chronyd[788]: Loaded 0 symmetric keys
Oct 07 20:51:12 localhost chronyd[788]: Using right/UTC timezone to obtain leap second data
Oct 07 20:51:12 localhost chronyd[788]: Loaded seccomp filter (level 2)
Oct 07 20:51:12 localhost systemd[1]: Started irqbalance daemon.
Oct 07 20:51:12 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Oct 07 20:51:12 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 07 20:51:12 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 07 20:51:12 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 07 20:51:12 localhost systemd[1]: Reached target sshd-keygen.target.
Oct 07 20:51:12 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Oct 07 20:51:12 localhost systemd[1]: Reached target User and Group Name Lookups.
Oct 07 20:51:12 localhost systemd[1]: Starting User Login Management...
Oct 07 20:51:12 localhost systemd[1]: Started NTP client/server.
Oct 07 20:51:12 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Oct 07 20:51:12 localhost kernel: kvm_amd: TSC scaling supported
Oct 07 20:51:12 localhost kernel: kvm_amd: Nested Virtualization enabled
Oct 07 20:51:12 localhost kernel: kvm_amd: Nested Paging enabled
Oct 07 20:51:12 localhost kernel: kvm_amd: LBR virtualization supported
Oct 07 20:51:12 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Oct 07 20:51:12 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Oct 07 20:51:12 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Oct 07 20:51:12 localhost kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Oct 07 20:51:12 localhost systemd-logind[798]: Watching system buttons on /dev/input/event0 (Power Button)
Oct 07 20:51:12 localhost systemd-logind[798]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Oct 07 20:51:12 localhost kernel: Console: switching to colour dummy device 80x25
Oct 07 20:51:12 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Oct 07 20:51:12 localhost kernel: [drm] features: -context_init
Oct 07 20:51:12 localhost kernel: [drm] number of scanouts: 1
Oct 07 20:51:12 localhost kernel: [drm] number of cap sets: 0
Oct 07 20:51:12 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Oct 07 20:51:12 localhost systemd-logind[798]: New seat seat0.
Oct 07 20:51:12 localhost systemd[1]: Started User Login Management.
Oct 07 20:51:12 localhost kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Oct 07 20:51:12 localhost kernel: Console: switching to colour frame buffer device 128x48
Oct 07 20:51:12 localhost kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Oct 07 20:51:12 localhost iptables.init[791]: iptables: Applying firewall rules: [  OK  ]
Oct 07 20:51:12 localhost systemd[1]: Finished IPv4 firewall with iptables.
Oct 07 20:51:13 localhost cloud-init[841]: Cloud-init v. 24.4-7.el9 running 'init-local' at Tue, 07 Oct 2025 20:51:13 +0000. Up 6.94 seconds.
Oct 07 20:51:13 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Oct 07 20:51:13 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Oct 07 20:51:13 localhost systemd[1]: run-cloud\x2dinit-tmp-tmp4zf_wad2.mount: Deactivated successfully.
Oct 07 20:51:13 localhost systemd[1]: Starting Hostname Service...
Oct 07 20:51:13 localhost systemd[1]: Started Hostname Service.
Oct 07 20:51:13 np0005474957.novalocal systemd-hostnamed[855]: Hostname set to <np0005474957.novalocal> (static)
Oct 07 20:51:13 np0005474957.novalocal systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Oct 07 20:51:13 np0005474957.novalocal systemd[1]: Reached target Preparation for Network.
Oct 07 20:51:13 np0005474957.novalocal systemd[1]: Starting Network Manager...
Oct 07 20:51:13 np0005474957.novalocal NetworkManager[859]: <info>  [1759870273.8636] NetworkManager (version 1.54.1-1.el9) is starting... (boot:7431bc2e-d322-496b-a062-a84e4f20d15f)
Oct 07 20:51:13 np0005474957.novalocal NetworkManager[859]: <info>  [1759870273.8641] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Oct 07 20:51:13 np0005474957.novalocal NetworkManager[859]: <info>  [1759870273.8834] manager[0x55ad57112080]: monitoring kernel firmware directory '/lib/firmware'.
Oct 07 20:51:13 np0005474957.novalocal NetworkManager[859]: <info>  [1759870273.8903] hostname: hostname: using hostnamed
Oct 07 20:51:13 np0005474957.novalocal NetworkManager[859]: <info>  [1759870273.8904] hostname: static hostname changed from (none) to "np0005474957.novalocal"
Oct 07 20:51:13 np0005474957.novalocal NetworkManager[859]: <info>  [1759870273.8910] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Oct 07 20:51:13 np0005474957.novalocal NetworkManager[859]: <info>  [1759870273.9040] manager[0x55ad57112080]: rfkill: Wi-Fi hardware radio set enabled
Oct 07 20:51:13 np0005474957.novalocal NetworkManager[859]: <info>  [1759870273.9042] manager[0x55ad57112080]: rfkill: WWAN hardware radio set enabled
Oct 07 20:51:13 np0005474957.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Oct 07 20:51:13 np0005474957.novalocal NetworkManager[859]: <info>  [1759870273.9148] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Oct 07 20:51:13 np0005474957.novalocal NetworkManager[859]: <info>  [1759870273.9148] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Oct 07 20:51:13 np0005474957.novalocal NetworkManager[859]: <info>  [1759870273.9149] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Oct 07 20:51:13 np0005474957.novalocal NetworkManager[859]: <info>  [1759870273.9149] manager: Networking is enabled by state file
Oct 07 20:51:13 np0005474957.novalocal NetworkManager[859]: <info>  [1759870273.9151] settings: Loaded settings plugin: keyfile (internal)
Oct 07 20:51:13 np0005474957.novalocal NetworkManager[859]: <info>  [1759870273.9190] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Oct 07 20:51:13 np0005474957.novalocal NetworkManager[859]: <info>  [1759870273.9216] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Oct 07 20:51:13 np0005474957.novalocal NetworkManager[859]: <info>  [1759870273.9242] dhcp: init: Using DHCP client 'internal'
Oct 07 20:51:13 np0005474957.novalocal NetworkManager[859]: <info>  [1759870273.9245] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Oct 07 20:51:13 np0005474957.novalocal NetworkManager[859]: <info>  [1759870273.9263] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 07 20:51:13 np0005474957.novalocal NetworkManager[859]: <info>  [1759870273.9276] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Oct 07 20:51:13 np0005474957.novalocal NetworkManager[859]: <info>  [1759870273.9286] device (lo): Activation: starting connection 'lo' (1bf17f26-0348-44a6-aa6b-b5a51eaf7edf)
Oct 07 20:51:13 np0005474957.novalocal NetworkManager[859]: <info>  [1759870273.9296] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Oct 07 20:51:13 np0005474957.novalocal NetworkManager[859]: <info>  [1759870273.9301] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 07 20:51:13 np0005474957.novalocal NetworkManager[859]: <info>  [1759870273.9345] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Oct 07 20:51:13 np0005474957.novalocal NetworkManager[859]: <info>  [1759870273.9349] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Oct 07 20:51:13 np0005474957.novalocal NetworkManager[859]: <info>  [1759870273.9351] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Oct 07 20:51:13 np0005474957.novalocal NetworkManager[859]: <info>  [1759870273.9353] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Oct 07 20:51:13 np0005474957.novalocal NetworkManager[859]: <info>  [1759870273.9355] device (eth0): carrier: link connected
Oct 07 20:51:13 np0005474957.novalocal NetworkManager[859]: <info>  [1759870273.9358] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Oct 07 20:51:13 np0005474957.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct 07 20:51:13 np0005474957.novalocal NetworkManager[859]: <info>  [1759870273.9364] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Oct 07 20:51:13 np0005474957.novalocal NetworkManager[859]: <info>  [1759870273.9379] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct 07 20:51:13 np0005474957.novalocal systemd[1]: Started Network Manager.
Oct 07 20:51:13 np0005474957.novalocal NetworkManager[859]: <info>  [1759870273.9384] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct 07 20:51:13 np0005474957.novalocal NetworkManager[859]: <info>  [1759870273.9385] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 07 20:51:13 np0005474957.novalocal NetworkManager[859]: <info>  [1759870273.9387] manager: NetworkManager state is now CONNECTING
Oct 07 20:51:13 np0005474957.novalocal NetworkManager[859]: <info>  [1759870273.9389] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 07 20:51:13 np0005474957.novalocal systemd[1]: Reached target Network.
Oct 07 20:51:13 np0005474957.novalocal NetworkManager[859]: <info>  [1759870273.9396] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 07 20:51:13 np0005474957.novalocal NetworkManager[859]: <info>  [1759870273.9400] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct 07 20:51:13 np0005474957.novalocal systemd[1]: Starting Network Manager Wait Online...
Oct 07 20:51:13 np0005474957.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Oct 07 20:51:13 np0005474957.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Oct 07 20:51:13 np0005474957.novalocal NetworkManager[859]: <info>  [1759870273.9623] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Oct 07 20:51:13 np0005474957.novalocal NetworkManager[859]: <info>  [1759870273.9627] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Oct 07 20:51:13 np0005474957.novalocal NetworkManager[859]: <info>  [1759870273.9637] device (lo): Activation: successful, device activated.
Oct 07 20:51:13 np0005474957.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Oct 07 20:51:13 np0005474957.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Oct 07 20:51:13 np0005474957.novalocal systemd[1]: Reached target NFS client services.
Oct 07 20:51:13 np0005474957.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Oct 07 20:51:13 np0005474957.novalocal systemd[1]: Reached target Remote File Systems.
Oct 07 20:51:13 np0005474957.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Oct 07 20:51:15 np0005474957.novalocal NetworkManager[859]: <info>  [1759870275.6411] dhcp4 (eth0): state changed new lease, address=38.102.83.103
Oct 07 20:51:15 np0005474957.novalocal NetworkManager[859]: <info>  [1759870275.6421] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Oct 07 20:51:15 np0005474957.novalocal NetworkManager[859]: <info>  [1759870275.6446] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 07 20:51:15 np0005474957.novalocal NetworkManager[859]: <info>  [1759870275.6512] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 07 20:51:15 np0005474957.novalocal NetworkManager[859]: <info>  [1759870275.6513] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 07 20:51:15 np0005474957.novalocal NetworkManager[859]: <info>  [1759870275.6517] manager: NetworkManager state is now CONNECTED_SITE
Oct 07 20:51:15 np0005474957.novalocal NetworkManager[859]: <info>  [1759870275.6519] device (eth0): Activation: successful, device activated.
Oct 07 20:51:15 np0005474957.novalocal NetworkManager[859]: <info>  [1759870275.6524] manager: NetworkManager state is now CONNECTED_GLOBAL
Oct 07 20:51:15 np0005474957.novalocal NetworkManager[859]: <info>  [1759870275.6525] manager: startup complete
Oct 07 20:51:15 np0005474957.novalocal systemd[1]: Finished Network Manager Wait Online.
Oct 07 20:51:15 np0005474957.novalocal systemd[1]: Starting Cloud-init: Network Stage...
Oct 07 20:51:15 np0005474957.novalocal cloud-init[922]: Cloud-init v. 24.4-7.el9 running 'init' at Tue, 07 Oct 2025 20:51:15 +0000. Up 9.60 seconds.
Oct 07 20:51:16 np0005474957.novalocal cloud-init[922]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Oct 07 20:51:16 np0005474957.novalocal cloud-init[922]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Oct 07 20:51:16 np0005474957.novalocal cloud-init[922]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Oct 07 20:51:16 np0005474957.novalocal cloud-init[922]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Oct 07 20:51:16 np0005474957.novalocal cloud-init[922]: ci-info: |  eth0  | True |        38.102.83.103         | 255.255.255.0 | global | fa:16:3e:53:ae:bd |
Oct 07 20:51:16 np0005474957.novalocal cloud-init[922]: ci-info: |  eth0  | True | fe80::f816:3eff:fe53:aebd/64 |       .       |  link  | fa:16:3e:53:ae:bd |
Oct 07 20:51:16 np0005474957.novalocal cloud-init[922]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Oct 07 20:51:16 np0005474957.novalocal cloud-init[922]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Oct 07 20:51:16 np0005474957.novalocal cloud-init[922]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Oct 07 20:51:16 np0005474957.novalocal cloud-init[922]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Oct 07 20:51:16 np0005474957.novalocal cloud-init[922]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Oct 07 20:51:16 np0005474957.novalocal cloud-init[922]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Oct 07 20:51:16 np0005474957.novalocal cloud-init[922]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Oct 07 20:51:16 np0005474957.novalocal cloud-init[922]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Oct 07 20:51:16 np0005474957.novalocal cloud-init[922]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Oct 07 20:51:16 np0005474957.novalocal cloud-init[922]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Oct 07 20:51:16 np0005474957.novalocal cloud-init[922]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Oct 07 20:51:16 np0005474957.novalocal cloud-init[922]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Oct 07 20:51:16 np0005474957.novalocal cloud-init[922]: ci-info: +-------+-------------+---------+-----------+-------+
Oct 07 20:51:16 np0005474957.novalocal cloud-init[922]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Oct 07 20:51:16 np0005474957.novalocal cloud-init[922]: ci-info: +-------+-------------+---------+-----------+-------+
Oct 07 20:51:16 np0005474957.novalocal cloud-init[922]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Oct 07 20:51:16 np0005474957.novalocal cloud-init[922]: ci-info: |   3   |    local    |    ::   |    eth0   |   U   |
Oct 07 20:51:16 np0005474957.novalocal cloud-init[922]: ci-info: |   4   |  multicast  |    ::   |    eth0   |   U   |
Oct 07 20:51:16 np0005474957.novalocal cloud-init[922]: ci-info: +-------+-------------+---------+-----------+-------+
Oct 07 20:51:16 np0005474957.novalocal useradd[989]: new group: name=cloud-user, GID=1001
Oct 07 20:51:16 np0005474957.novalocal useradd[989]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Oct 07 20:51:16 np0005474957.novalocal useradd[989]: add 'cloud-user' to group 'adm'
Oct 07 20:51:16 np0005474957.novalocal useradd[989]: add 'cloud-user' to group 'systemd-journal'
Oct 07 20:51:16 np0005474957.novalocal useradd[989]: add 'cloud-user' to shadow group 'adm'
Oct 07 20:51:16 np0005474957.novalocal useradd[989]: add 'cloud-user' to shadow group 'systemd-journal'
Oct 07 20:51:17 np0005474957.novalocal cloud-init[922]: Generating public/private rsa key pair.
Oct 07 20:51:17 np0005474957.novalocal cloud-init[922]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Oct 07 20:51:17 np0005474957.novalocal cloud-init[922]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Oct 07 20:51:17 np0005474957.novalocal cloud-init[922]: The key fingerprint is:
Oct 07 20:51:17 np0005474957.novalocal cloud-init[922]: SHA256:KhHfRMgTyXK8HGWRyk1t3spLeHRuLzOfxv6Td+njmyQ root@np0005474957.novalocal
Oct 07 20:51:17 np0005474957.novalocal cloud-init[922]: The key's randomart image is:
Oct 07 20:51:17 np0005474957.novalocal cloud-init[922]: +---[RSA 3072]----+
Oct 07 20:51:17 np0005474957.novalocal cloud-init[922]: |     +.==+       |
Oct 07 20:51:17 np0005474957.novalocal cloud-init[922]: |    . Xoo o      |
Oct 07 20:51:17 np0005474957.novalocal cloud-init[922]: |    .= B.o .     |
Oct 07 20:51:17 np0005474957.novalocal cloud-init[922]: |     o=o. o o    |
Oct 07 20:51:17 np0005474957.novalocal cloud-init[922]: |    . . S+ +     |
Oct 07 20:51:17 np0005474957.novalocal cloud-init[922]: |     . .. = o    |
Oct 07 20:51:17 np0005474957.novalocal cloud-init[922]: |    . .  o o E .o|
Oct 07 20:51:17 np0005474957.novalocal cloud-init[922]: |     .    . + **+|
Oct 07 20:51:17 np0005474957.novalocal cloud-init[922]: |             B**B|
Oct 07 20:51:17 np0005474957.novalocal cloud-init[922]: +----[SHA256]-----+
Oct 07 20:51:17 np0005474957.novalocal cloud-init[922]: Generating public/private ecdsa key pair.
Oct 07 20:51:17 np0005474957.novalocal cloud-init[922]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Oct 07 20:51:17 np0005474957.novalocal cloud-init[922]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Oct 07 20:51:17 np0005474957.novalocal cloud-init[922]: The key fingerprint is:
Oct 07 20:51:17 np0005474957.novalocal cloud-init[922]: SHA256:dsiWsrtkdIyCxZPqcUs7lNbxb5hUKFiX//ws3n9QMVM root@np0005474957.novalocal
Oct 07 20:51:17 np0005474957.novalocal cloud-init[922]: The key's randomart image is:
Oct 07 20:51:17 np0005474957.novalocal cloud-init[922]: +---[ECDSA 256]---+
Oct 07 20:51:17 np0005474957.novalocal cloud-init[922]: |      . ..      E|
Oct 07 20:51:17 np0005474957.novalocal cloud-init[922]: |   . + ...     + |
Oct 07 20:51:17 np0005474957.novalocal cloud-init[922]: |    * o ...     +|
Oct 07 20:51:17 np0005474957.novalocal cloud-init[922]: |   + + B +.     .|
Oct 07 20:51:17 np0005474957.novalocal cloud-init[922]: |  + B = S .o   . |
Oct 07 20:51:17 np0005474957.novalocal cloud-init[922]: | . * = B =  o .  |
Oct 07 20:51:17 np0005474957.novalocal cloud-init[922]: |  . + + o o  o . |
Oct 07 20:51:17 np0005474957.novalocal cloud-init[922]: |     + . .  ..o .|
Oct 07 20:51:17 np0005474957.novalocal cloud-init[922]: |      o.   .....o|
Oct 07 20:51:17 np0005474957.novalocal cloud-init[922]: +----[SHA256]-----+
Oct 07 20:51:17 np0005474957.novalocal cloud-init[922]: Generating public/private ed25519 key pair.
Oct 07 20:51:17 np0005474957.novalocal cloud-init[922]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Oct 07 20:51:17 np0005474957.novalocal cloud-init[922]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Oct 07 20:51:17 np0005474957.novalocal cloud-init[922]: The key fingerprint is:
Oct 07 20:51:17 np0005474957.novalocal cloud-init[922]: SHA256:tuyXUqFxCx7qislMWaKc8RZxLim0S8UVk8GdSTN/UgY root@np0005474957.novalocal
Oct 07 20:51:17 np0005474957.novalocal cloud-init[922]: The key's randomart image is:
Oct 07 20:51:17 np0005474957.novalocal cloud-init[922]: +--[ED25519 256]--+
Oct 07 20:51:17 np0005474957.novalocal cloud-init[922]: |    .=*+oE.o     |
Oct 07 20:51:17 np0005474957.novalocal cloud-init[922]: |  . .o.++ o      |
Oct 07 20:51:17 np0005474957.novalocal cloud-init[922]: | . + .   o .     |
Oct 07 20:51:17 np0005474957.novalocal cloud-init[922]: |. o =   + =      |
Oct 07 20:51:17 np0005474957.novalocal cloud-init[922]: | =.+.. oS* o     |
Oct 07 20:51:17 np0005474957.novalocal cloud-init[922]: |o.*+o .oo.o      |
Oct 07 20:51:17 np0005474957.novalocal cloud-init[922]: |.+oo .  o. .     |
Oct 07 20:51:17 np0005474957.novalocal cloud-init[922]: | +.o  ... o      |
Oct 07 20:51:17 np0005474957.novalocal cloud-init[922]: |  = ..  .o       |
Oct 07 20:51:17 np0005474957.novalocal cloud-init[922]: +----[SHA256]-----+
Oct 07 20:51:17 np0005474957.novalocal systemd[1]: Finished Cloud-init: Network Stage.
Oct 07 20:51:17 np0005474957.novalocal systemd[1]: Reached target Cloud-config availability.
Oct 07 20:51:17 np0005474957.novalocal systemd[1]: Reached target Network is Online.
Oct 07 20:51:17 np0005474957.novalocal systemd[1]: Starting Cloud-init: Config Stage...
Oct 07 20:51:17 np0005474957.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Oct 07 20:51:17 np0005474957.novalocal sm-notify[1004]: Version 2.5.4 starting
Oct 07 20:51:17 np0005474957.novalocal systemd[1]: Starting System Logging Service...
Oct 07 20:51:17 np0005474957.novalocal systemd[1]: Starting OpenSSH server daemon...
Oct 07 20:51:17 np0005474957.novalocal systemd[1]: Starting Permit User Sessions...
Oct 07 20:51:17 np0005474957.novalocal systemd[1]: Started Notify NFS peers of a restart.
Oct 07 20:51:17 np0005474957.novalocal sshd[1006]: Server listening on 0.0.0.0 port 22.
Oct 07 20:51:17 np0005474957.novalocal sshd[1006]: Server listening on :: port 22.
Oct 07 20:51:17 np0005474957.novalocal systemd[1]: Started OpenSSH server daemon.
Oct 07 20:51:17 np0005474957.novalocal systemd[1]: Finished Permit User Sessions.
Oct 07 20:51:17 np0005474957.novalocal systemd[1]: Started Command Scheduler.
Oct 07 20:51:17 np0005474957.novalocal systemd[1]: Started Getty on tty1.
Oct 07 20:51:17 np0005474957.novalocal systemd[1]: Started Serial Getty on ttyS0.
Oct 07 20:51:17 np0005474957.novalocal systemd[1]: Reached target Login Prompts.
Oct 07 20:51:17 np0005474957.novalocal crond[1008]: (CRON) STARTUP (1.5.7)
Oct 07 20:51:17 np0005474957.novalocal crond[1008]: (CRON) INFO (Syslog will be used instead of sendmail.)
Oct 07 20:51:17 np0005474957.novalocal crond[1008]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 76% if used.)
Oct 07 20:51:17 np0005474957.novalocal crond[1008]: (CRON) INFO (running with inotify support)
Oct 07 20:51:17 np0005474957.novalocal rsyslogd[1005]: [origin software="rsyslogd" swVersion="8.2506.0-2.el9" x-pid="1005" x-info="https://www.rsyslog.com"] start
Oct 07 20:51:17 np0005474957.novalocal systemd[1]: Started System Logging Service.
Oct 07 20:51:17 np0005474957.novalocal rsyslogd[1005]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Oct 07 20:51:17 np0005474957.novalocal systemd[1]: Reached target Multi-User System.
Oct 07 20:51:17 np0005474957.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Oct 07 20:51:17 np0005474957.novalocal rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 07 20:51:17 np0005474957.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Oct 07 20:51:17 np0005474957.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Oct 07 20:51:17 np0005474957.novalocal sshd-session[1015]: Connection reset by 38.102.83.114 port 43292 [preauth]
Oct 07 20:51:17 np0005474957.novalocal sshd-session[1018]: Unable to negotiate with 38.102.83.114 port 43296: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Oct 07 20:51:17 np0005474957.novalocal sshd-session[1023]: Unable to negotiate with 38.102.83.114 port 43304: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Oct 07 20:51:17 np0005474957.novalocal sshd-session[1025]: Unable to negotiate with 38.102.83.114 port 43308: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Oct 07 20:51:17 np0005474957.novalocal cloud-init[1026]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Tue, 07 Oct 2025 20:51:17 +0000. Up 11.34 seconds.
Oct 07 20:51:17 np0005474957.novalocal sshd-session[1028]: Connection reset by 38.102.83.114 port 43310 [preauth]
Oct 07 20:51:17 np0005474957.novalocal sshd-session[1032]: Unable to negotiate with 38.102.83.114 port 43328: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
Oct 07 20:51:17 np0005474957.novalocal sshd-session[1021]: Connection closed by 38.102.83.114 port 43298 [preauth]
Oct 07 20:51:17 np0005474957.novalocal sshd-session[1034]: Unable to negotiate with 38.102.83.114 port 43344: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Oct 07 20:51:17 np0005474957.novalocal systemd[1]: Finished Cloud-init: Config Stage.
Oct 07 20:51:17 np0005474957.novalocal systemd[1]: Starting Cloud-init: Final Stage...
Oct 07 20:51:17 np0005474957.novalocal sshd-session[1030]: Connection closed by 38.102.83.114 port 43316 [preauth]
Oct 07 20:51:18 np0005474957.novalocal cloud-init[1039]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Tue, 07 Oct 2025 20:51:18 +0000. Up 11.76 seconds.
Oct 07 20:51:18 np0005474957.novalocal cloud-init[1041]: #############################################################
Oct 07 20:51:18 np0005474957.novalocal cloud-init[1042]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Oct 07 20:51:18 np0005474957.novalocal cloud-init[1044]: 256 SHA256:dsiWsrtkdIyCxZPqcUs7lNbxb5hUKFiX//ws3n9QMVM root@np0005474957.novalocal (ECDSA)
Oct 07 20:51:18 np0005474957.novalocal cloud-init[1046]: 256 SHA256:tuyXUqFxCx7qislMWaKc8RZxLim0S8UVk8GdSTN/UgY root@np0005474957.novalocal (ED25519)
Oct 07 20:51:18 np0005474957.novalocal cloud-init[1048]: 3072 SHA256:KhHfRMgTyXK8HGWRyk1t3spLeHRuLzOfxv6Td+njmyQ root@np0005474957.novalocal (RSA)
Oct 07 20:51:18 np0005474957.novalocal cloud-init[1049]: -----END SSH HOST KEY FINGERPRINTS-----
Oct 07 20:51:18 np0005474957.novalocal cloud-init[1050]: #############################################################
Oct 07 20:51:18 np0005474957.novalocal cloud-init[1039]: Cloud-init v. 24.4-7.el9 finished at Tue, 07 Oct 2025 20:51:18 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 12.00 seconds
Oct 07 20:51:18 np0005474957.novalocal systemd[1]: Finished Cloud-init: Final Stage.
Oct 07 20:51:18 np0005474957.novalocal systemd[1]: Reached target Cloud-init target.
Oct 07 20:51:18 np0005474957.novalocal systemd[1]: Startup finished in 1.569s (kernel) + 2.862s (initrd) + 7.642s (userspace) = 12.074s.
Oct 07 20:51:20 np0005474957.novalocal chronyd[788]: Selected source 172.97.210.214 (2.centos.pool.ntp.org)
Oct 07 20:51:20 np0005474957.novalocal chronyd[788]: System clock TAI offset set to 37 seconds
Oct 07 20:51:23 np0005474957.novalocal irqbalance[793]: Cannot change IRQ 25 affinity: Operation not permitted
Oct 07 20:51:23 np0005474957.novalocal irqbalance[793]: IRQ 25 affinity is now unmanaged
Oct 07 20:51:23 np0005474957.novalocal irqbalance[793]: Cannot change IRQ 31 affinity: Operation not permitted
Oct 07 20:51:23 np0005474957.novalocal irqbalance[793]: IRQ 31 affinity is now unmanaged
Oct 07 20:51:23 np0005474957.novalocal irqbalance[793]: Cannot change IRQ 28 affinity: Operation not permitted
Oct 07 20:51:23 np0005474957.novalocal irqbalance[793]: IRQ 28 affinity is now unmanaged
Oct 07 20:51:23 np0005474957.novalocal irqbalance[793]: Cannot change IRQ 32 affinity: Operation not permitted
Oct 07 20:51:23 np0005474957.novalocal irqbalance[793]: IRQ 32 affinity is now unmanaged
Oct 07 20:51:23 np0005474957.novalocal irqbalance[793]: Cannot change IRQ 30 affinity: Operation not permitted
Oct 07 20:51:23 np0005474957.novalocal irqbalance[793]: IRQ 30 affinity is now unmanaged
Oct 07 20:51:23 np0005474957.novalocal irqbalance[793]: Cannot change IRQ 29 affinity: Operation not permitted
Oct 07 20:51:23 np0005474957.novalocal irqbalance[793]: IRQ 29 affinity is now unmanaged
Oct 07 20:51:25 np0005474957.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct 07 20:51:35 np0005474957.novalocal sshd-session[1054]: Accepted publickey for zuul from 38.102.83.114 port 59352 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Oct 07 20:51:35 np0005474957.novalocal systemd-logind[798]: New session 1 of user zuul.
Oct 07 20:51:35 np0005474957.novalocal systemd[1]: Created slice User Slice of UID 1000.
Oct 07 20:51:35 np0005474957.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Oct 07 20:51:35 np0005474957.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Oct 07 20:51:35 np0005474957.novalocal systemd[1]: Starting User Manager for UID 1000...
Oct 07 20:51:35 np0005474957.novalocal systemd[1058]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 07 20:51:36 np0005474957.novalocal systemd[1058]: Queued start job for default target Main User Target.
Oct 07 20:51:36 np0005474957.novalocal systemd[1058]: Created slice User Application Slice.
Oct 07 20:51:36 np0005474957.novalocal systemd[1058]: Started Mark boot as successful after the user session has run 2 minutes.
Oct 07 20:51:36 np0005474957.novalocal systemd[1058]: Started Daily Cleanup of User's Temporary Directories.
Oct 07 20:51:36 np0005474957.novalocal systemd[1058]: Reached target Paths.
Oct 07 20:51:36 np0005474957.novalocal systemd[1058]: Reached target Timers.
Oct 07 20:51:36 np0005474957.novalocal systemd[1058]: Starting D-Bus User Message Bus Socket...
Oct 07 20:51:36 np0005474957.novalocal systemd[1058]: Starting Create User's Volatile Files and Directories...
Oct 07 20:51:36 np0005474957.novalocal systemd[1058]: Finished Create User's Volatile Files and Directories.
Oct 07 20:51:36 np0005474957.novalocal systemd[1058]: Listening on D-Bus User Message Bus Socket.
Oct 07 20:51:36 np0005474957.novalocal systemd[1058]: Reached target Sockets.
Oct 07 20:51:36 np0005474957.novalocal systemd[1058]: Reached target Basic System.
Oct 07 20:51:36 np0005474957.novalocal systemd[1058]: Reached target Main User Target.
Oct 07 20:51:36 np0005474957.novalocal systemd[1058]: Startup finished in 132ms.
Oct 07 20:51:36 np0005474957.novalocal systemd[1]: Started User Manager for UID 1000.
Oct 07 20:51:36 np0005474957.novalocal systemd[1]: Started Session 1 of User zuul.
Oct 07 20:51:36 np0005474957.novalocal sshd-session[1054]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 07 20:51:36 np0005474957.novalocal python3[1140]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 07 20:51:39 np0005474957.novalocal python3[1168]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 07 20:51:43 np0005474957.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct 07 20:51:46 np0005474957.novalocal python3[1228]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 07 20:51:47 np0005474957.novalocal python3[1268]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Oct 07 20:51:49 np0005474957.novalocal python3[1294]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDQxEqhfIWAKsCZW3VH/psCN5mCrObX8xn5wz9YktEI1vG2XdaZMyueDluWATMyXqLCAzVVvQdDFOQHF3A8QclFWuuKx83tEQaONfl1wCmU48MCPJ8ahKfFE6svgEsr64/IUozJ/brLvVIVapEuym1UtFfmcAr0/MMoshaJR6q1c83L5jOxVzTDp+UxVmGIl5dAWgqP40va6nhMbAOw766/KzXhuZygc9kwe0ISOnJZjCWEk0UtO+LNNSFQZoU4XkWIOo9n7JsZnPEfErbY+L68RCP8zTBRo7J2/+s/BDfbkhEh8ijn3YXGSCLetlcYUE2pZXm08vq4EK/Qr5S1Cw0O7Qu4kwJEgufo7jHJb0pRXmhjpOpdiBtK+ZRZl1ByFQoM7WvRovgWhj9kGaVjXa9Qea8aztHDMXIypykCYq8Mojbh0oZhcqIs1HGfNMk+uZv90oTU1XAdy94wyMUkOhfNrNk99LKNLR3wia7DXqJLXZP/gGDegixUcXHNTuUTGaM= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 07 20:51:49 np0005474957.novalocal python3[1318]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 20:51:50 np0005474957.novalocal python3[1417]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 07 20:51:50 np0005474957.novalocal python3[1488]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759870310.0223396-229-166971532187130/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=3dd9c82b066d4975b644d934a34f4ca1_id_rsa follow=False checksum=7b804799402a015b853275f9346c95ea0e7538d5 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 20:51:51 np0005474957.novalocal python3[1611]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 07 20:51:51 np0005474957.novalocal python3[1682]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759870311.0380714-273-194696757753756/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=3dd9c82b066d4975b644d934a34f4ca1_id_rsa.pub follow=False checksum=f5dab9c187d398ab40cd074e97a1655f3ce87dfc backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 20:51:53 np0005474957.novalocal python3[1730]: ansible-ping Invoked with data=pong
Oct 07 20:51:54 np0005474957.novalocal python3[1754]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 07 20:51:56 np0005474957.novalocal python3[1812]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Oct 07 20:51:57 np0005474957.novalocal python3[1844]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 20:51:57 np0005474957.novalocal python3[1868]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 20:51:57 np0005474957.novalocal python3[1892]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 20:51:58 np0005474957.novalocal python3[1916]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 20:51:58 np0005474957.novalocal python3[1940]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 20:51:58 np0005474957.novalocal python3[1964]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 20:52:00 np0005474957.novalocal sudo[1988]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqjkyawppvbbyvxkvkyvyypppdcemibn ; /usr/bin/python3'
Oct 07 20:52:00 np0005474957.novalocal sudo[1988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 20:52:00 np0005474957.novalocal python3[1990]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 20:52:00 np0005474957.novalocal sudo[1988]: pam_unix(sudo:session): session closed for user root
Oct 07 20:52:00 np0005474957.novalocal sudo[2066]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxhgboqpdeqroqsodrtdefomukkmpjgu ; /usr/bin/python3'
Oct 07 20:52:00 np0005474957.novalocal sudo[2066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 20:52:01 np0005474957.novalocal python3[2068]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 07 20:52:01 np0005474957.novalocal sudo[2066]: pam_unix(sudo:session): session closed for user root
Oct 07 20:52:01 np0005474957.novalocal sudo[2139]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sygmnegxkuyujvkdanrbkggjetaybudy ; /usr/bin/python3'
Oct 07 20:52:01 np0005474957.novalocal sudo[2139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 20:52:01 np0005474957.novalocal python3[2141]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759870320.5404117-26-228419178856325/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 20:52:01 np0005474957.novalocal sudo[2139]: pam_unix(sudo:session): session closed for user root
Oct 07 20:52:02 np0005474957.novalocal python3[2189]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 07 20:52:02 np0005474957.novalocal python3[2213]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 07 20:52:02 np0005474957.novalocal python3[2237]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 07 20:52:03 np0005474957.novalocal python3[2261]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 07 20:52:03 np0005474957.novalocal python3[2285]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 07 20:52:03 np0005474957.novalocal python3[2309]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 07 20:52:03 np0005474957.novalocal python3[2333]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 07 20:52:04 np0005474957.novalocal python3[2357]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 07 20:52:04 np0005474957.novalocal python3[2381]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 07 20:52:04 np0005474957.novalocal python3[2405]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 07 20:52:05 np0005474957.novalocal python3[2429]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 07 20:52:05 np0005474957.novalocal python3[2453]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 07 20:52:05 np0005474957.novalocal python3[2477]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 07 20:52:05 np0005474957.novalocal python3[2501]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 07 20:52:06 np0005474957.novalocal python3[2525]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 07 20:52:06 np0005474957.novalocal python3[2549]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 07 20:52:06 np0005474957.novalocal python3[2573]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 07 20:52:07 np0005474957.novalocal python3[2597]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 07 20:52:07 np0005474957.novalocal python3[2621]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 07 20:52:07 np0005474957.novalocal python3[2645]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 07 20:52:07 np0005474957.novalocal python3[2669]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 07 20:52:08 np0005474957.novalocal python3[2693]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 07 20:52:08 np0005474957.novalocal python3[2717]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 07 20:52:08 np0005474957.novalocal python3[2741]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 07 20:52:09 np0005474957.novalocal python3[2765]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 07 20:52:09 np0005474957.novalocal python3[2789]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 07 20:52:12 np0005474957.novalocal sudo[2813]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujmuvzwwbsxszesctkusimqaorjmvfue ; /usr/bin/python3'
Oct 07 20:52:12 np0005474957.novalocal sudo[2813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 20:52:12 np0005474957.novalocal python3[2815]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Oct 07 20:52:12 np0005474957.novalocal systemd[1]: Starting Time & Date Service...
Oct 07 20:52:12 np0005474957.novalocal systemd[1]: Started Time & Date Service.
Oct 07 20:52:12 np0005474957.novalocal systemd-timedated[2817]: Changed time zone to 'UTC' (UTC).
Oct 07 20:52:12 np0005474957.novalocal sudo[2813]: pam_unix(sudo:session): session closed for user root
Oct 07 20:52:12 np0005474957.novalocal sudo[2845]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-porddeoqzxshqlelzcrvvhlzjgmhbpuk ; /usr/bin/python3'
Oct 07 20:52:12 np0005474957.novalocal sudo[2845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 20:52:12 np0005474957.novalocal python3[2847]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 20:52:12 np0005474957.novalocal sudo[2845]: pam_unix(sudo:session): session closed for user root
Oct 07 20:52:13 np0005474957.novalocal python3[2923]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 07 20:52:13 np0005474957.novalocal python3[2994]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1759870333.0571275-202-219093852928877/source _original_basename=tmppf6w1uee follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 20:52:14 np0005474957.novalocal python3[3094]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 07 20:52:14 np0005474957.novalocal python3[3165]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1759870334.0021863-242-235972110438594/source _original_basename=tmpqvhwtaig follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 20:52:15 np0005474957.novalocal sudo[3265]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znvkemabrucfketpavjgmednguhlchiq ; /usr/bin/python3'
Oct 07 20:52:15 np0005474957.novalocal sudo[3265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 20:52:15 np0005474957.novalocal python3[3267]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 07 20:52:15 np0005474957.novalocal sudo[3265]: pam_unix(sudo:session): session closed for user root
Oct 07 20:52:15 np0005474957.novalocal sudo[3338]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwbdqarkldnfscxluqohlnuousgnypkq ; /usr/bin/python3'
Oct 07 20:52:15 np0005474957.novalocal sudo[3338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 20:52:16 np0005474957.novalocal python3[3340]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1759870335.2921464-306-53173168602378/source _original_basename=tmp6pkkvgce follow=False checksum=b6912f62163ba8caf9fd3d2031d59be4745f014b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 20:52:16 np0005474957.novalocal sudo[3338]: pam_unix(sudo:session): session closed for user root
Oct 07 20:52:16 np0005474957.novalocal python3[3388]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 20:52:16 np0005474957.novalocal python3[3414]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 20:52:17 np0005474957.novalocal sudo[3492]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqpijspekuwqtvambyhqelysrjurbtzo ; /usr/bin/python3'
Oct 07 20:52:17 np0005474957.novalocal sudo[3492]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 20:52:17 np0005474957.novalocal python3[3494]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 07 20:52:17 np0005474957.novalocal sudo[3492]: pam_unix(sudo:session): session closed for user root
Oct 07 20:52:17 np0005474957.novalocal sudo[3565]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgzaztudruqfnwerjyvfozrtttgmqpbw ; /usr/bin/python3'
Oct 07 20:52:17 np0005474957.novalocal sudo[3565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 20:52:17 np0005474957.novalocal python3[3567]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1759870337.1464021-362-238306341469625/source _original_basename=tmps0habh0a follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 20:52:17 np0005474957.novalocal sudo[3565]: pam_unix(sudo:session): session closed for user root
Oct 07 20:52:18 np0005474957.novalocal sudo[3616]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fouyjxvikqarfrvhdnjxloacgylzejsc ; /usr/bin/python3'
Oct 07 20:52:18 np0005474957.novalocal sudo[3616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 20:52:18 np0005474957.novalocal python3[3618]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163e3b-3c83-febb-94a4-00000000001e-1-compute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 20:52:18 np0005474957.novalocal sudo[3616]: pam_unix(sudo:session): session closed for user root
Oct 07 20:52:19 np0005474957.novalocal python3[3646]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env
                                                       _uses_shell=True zuul_log_id=fa163e3b-3c83-febb-94a4-00000000001f-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Oct 07 20:52:20 np0005474957.novalocal python3[3674]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 20:52:42 np0005474957.novalocal sudo[3698]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kssivbxoljuwnachkjvoyutcbjggwsxe ; /usr/bin/python3'
Oct 07 20:52:42 np0005474957.novalocal sudo[3698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 20:52:42 np0005474957.novalocal python3[3700]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 20:52:42 np0005474957.novalocal sudo[3698]: pam_unix(sudo:session): session closed for user root
Oct 07 20:52:42 np0005474957.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Oct 07 20:53:17 np0005474957.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Oct 07 20:53:17 np0005474957.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Oct 07 20:53:17 np0005474957.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Oct 07 20:53:17 np0005474957.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Oct 07 20:53:17 np0005474957.novalocal kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Oct 07 20:53:17 np0005474957.novalocal kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Oct 07 20:53:17 np0005474957.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Oct 07 20:53:17 np0005474957.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Oct 07 20:53:17 np0005474957.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Oct 07 20:53:17 np0005474957.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Oct 07 20:53:17 np0005474957.novalocal NetworkManager[859]: <info>  [1759870397.8424] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Oct 07 20:53:17 np0005474957.novalocal systemd-udevd[3703]: Network interface NamePolicy= disabled on kernel command line.
Oct 07 20:53:17 np0005474957.novalocal NetworkManager[859]: <info>  [1759870397.8602] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 07 20:53:17 np0005474957.novalocal NetworkManager[859]: <info>  [1759870397.8630] settings: (eth1): created default wired connection 'Wired connection 1'
Oct 07 20:53:17 np0005474957.novalocal NetworkManager[859]: <info>  [1759870397.8633] device (eth1): carrier: link connected
Oct 07 20:53:17 np0005474957.novalocal NetworkManager[859]: <info>  [1759870397.8635] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Oct 07 20:53:17 np0005474957.novalocal NetworkManager[859]: <info>  [1759870397.8640] policy: auto-activating connection 'Wired connection 1' (dba2d05c-583f-3e37-8712-a9017933dd6a)
Oct 07 20:53:17 np0005474957.novalocal NetworkManager[859]: <info>  [1759870397.8644] device (eth1): Activation: starting connection 'Wired connection 1' (dba2d05c-583f-3e37-8712-a9017933dd6a)
Oct 07 20:53:17 np0005474957.novalocal NetworkManager[859]: <info>  [1759870397.8645] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 07 20:53:17 np0005474957.novalocal NetworkManager[859]: <info>  [1759870397.8647] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 07 20:53:17 np0005474957.novalocal NetworkManager[859]: <info>  [1759870397.8651] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 07 20:53:17 np0005474957.novalocal NetworkManager[859]: <info>  [1759870397.8655] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Oct 07 20:53:18 np0005474957.novalocal python3[3730]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163e3b-3c83-77b2-867d-000000000112-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 20:53:25 np0005474957.novalocal sudo[3808]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvpfuspytdeixsxppmrxhqgtigmhbosr ; OS_CLOUD=vexxhost /usr/bin/python3'
Oct 07 20:53:25 np0005474957.novalocal sudo[3808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 20:53:25 np0005474957.novalocal python3[3810]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 07 20:53:25 np0005474957.novalocal sudo[3808]: pam_unix(sudo:session): session closed for user root
Oct 07 20:53:26 np0005474957.novalocal sudo[3881]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kktlzcopwnfrovfrwcqvwncfqhvereej ; OS_CLOUD=vexxhost /usr/bin/python3'
Oct 07 20:53:26 np0005474957.novalocal sudo[3881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 20:53:26 np0005474957.novalocal python3[3883]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759870405.5380144-103-64163080846708/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=5d99afcab61d82fa8595c6e207f3f6d384d08dd3 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 20:53:26 np0005474957.novalocal sudo[3881]: pam_unix(sudo:session): session closed for user root
Oct 07 20:53:26 np0005474957.novalocal sudo[3931]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eiydjsaftsctzqcxztrarsetohinsbgn ; OS_CLOUD=vexxhost /usr/bin/python3'
Oct 07 20:53:26 np0005474957.novalocal sudo[3931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 20:53:27 np0005474957.novalocal python3[3933]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 07 20:53:27 np0005474957.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Oct 07 20:53:27 np0005474957.novalocal systemd[1]: Stopped Network Manager Wait Online.
Oct 07 20:53:27 np0005474957.novalocal systemd[1]: Stopping Network Manager Wait Online...
Oct 07 20:53:27 np0005474957.novalocal systemd[1]: Stopping Network Manager...
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[859]: <info>  [1759870407.2327] caught SIGTERM, shutting down normally.
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[859]: <info>  [1759870407.2342] dhcp4 (eth0): canceled DHCP transaction
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[859]: <info>  [1759870407.2343] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[859]: <info>  [1759870407.2344] dhcp4 (eth0): state changed no lease
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[859]: <info>  [1759870407.2347] manager: NetworkManager state is now CONNECTING
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[859]: <info>  [1759870407.2469] dhcp4 (eth1): canceled DHCP transaction
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[859]: <info>  [1759870407.2470] dhcp4 (eth1): state changed no lease
Oct 07 20:53:27 np0005474957.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[859]: <info>  [1759870407.2528] exiting (success)
Oct 07 20:53:27 np0005474957.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Oct 07 20:53:27 np0005474957.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Oct 07 20:53:27 np0005474957.novalocal systemd[1]: Stopped Network Manager.
Oct 07 20:53:27 np0005474957.novalocal systemd[1]: Starting Network Manager...
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870407.3465] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:7431bc2e-d322-496b-a062-a84e4f20d15f)
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870407.3470] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870407.3536] manager[0x55638f083070]: monitoring kernel firmware directory '/lib/firmware'.
Oct 07 20:53:27 np0005474957.novalocal systemd[1]: Starting Hostname Service...
Oct 07 20:53:27 np0005474957.novalocal systemd[1]: Started Hostname Service.
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870407.4435] hostname: hostname: using hostnamed
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870407.4442] hostname: static hostname changed from (none) to "np0005474957.novalocal"
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870407.4451] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870407.4459] manager[0x55638f083070]: rfkill: Wi-Fi hardware radio set enabled
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870407.4460] manager[0x55638f083070]: rfkill: WWAN hardware radio set enabled
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870407.4512] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870407.4512] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870407.4513] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870407.4515] manager: Networking is enabled by state file
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870407.4519] settings: Loaded settings plugin: keyfile (internal)
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870407.4527] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870407.4577] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870407.4597] dhcp: init: Using DHCP client 'internal'
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870407.4602] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870407.4613] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870407.4624] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870407.4641] device (lo): Activation: starting connection 'lo' (1bf17f26-0348-44a6-aa6b-b5a51eaf7edf)
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870407.4655] device (eth0): carrier: link connected
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870407.4664] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870407.4677] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870407.4678] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870407.4693] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870407.4704] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870407.4715] device (eth1): carrier: link connected
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870407.4724] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870407.4734] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (dba2d05c-583f-3e37-8712-a9017933dd6a) (indicated)
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870407.4735] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870407.4745] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870407.4760] device (eth1): Activation: starting connection 'Wired connection 1' (dba2d05c-583f-3e37-8712-a9017933dd6a)
Oct 07 20:53:27 np0005474957.novalocal systemd[1]: Started Network Manager.
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870407.4770] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870407.4781] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870407.4788] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870407.4794] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870407.4800] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870407.4805] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870407.4809] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870407.4812] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870407.4818] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870407.4835] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870407.4840] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870407.4856] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870407.4860] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870407.4886] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870407.4895] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870407.4905] device (lo): Activation: successful, device activated.
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870407.4916] dhcp4 (eth0): state changed new lease, address=38.102.83.103
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870407.4928] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Oct 07 20:53:27 np0005474957.novalocal systemd[1]: Starting Network Manager Wait Online...
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870407.5014] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870407.5044] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870407.5047] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870407.5054] manager: NetworkManager state is now CONNECTED_SITE
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870407.5060] device (eth0): Activation: successful, device activated.
Oct 07 20:53:27 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870407.5069] manager: NetworkManager state is now CONNECTED_GLOBAL
Oct 07 20:53:27 np0005474957.novalocal sudo[3931]: pam_unix(sudo:session): session closed for user root
Oct 07 20:53:27 np0005474957.novalocal python3[4017]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163e3b-3c83-77b2-867d-0000000000b2-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 20:53:37 np0005474957.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct 07 20:53:57 np0005474957.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct 07 20:54:08 np0005474957.novalocal systemd[1058]: Starting Mark boot as successful...
Oct 07 20:54:08 np0005474957.novalocal systemd[1058]: Finished Mark boot as successful.
Oct 07 20:54:12 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870452.3614] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Oct 07 20:54:12 np0005474957.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct 07 20:54:12 np0005474957.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Oct 07 20:54:12 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870452.3967] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Oct 07 20:54:12 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870452.3974] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Oct 07 20:54:12 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870452.3991] device (eth1): Activation: successful, device activated.
Oct 07 20:54:12 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870452.4005] manager: startup complete
Oct 07 20:54:12 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870452.4011] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Oct 07 20:54:12 np0005474957.novalocal NetworkManager[3945]: <warn>  [1759870452.4022] device (eth1): Activation: failed for connection 'Wired connection 1'
Oct 07 20:54:12 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870452.4037] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Oct 07 20:54:12 np0005474957.novalocal systemd[1]: Finished Network Manager Wait Online.
Oct 07 20:54:12 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870452.4219] dhcp4 (eth1): canceled DHCP transaction
Oct 07 20:54:12 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870452.4220] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Oct 07 20:54:12 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870452.4220] dhcp4 (eth1): state changed no lease
Oct 07 20:54:12 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870452.4242] policy: auto-activating connection 'ci-private-network' (6ff53518-8ab0-58fa-aaf9-1d4f04c1efd5)
Oct 07 20:54:12 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870452.4249] device (eth1): Activation: starting connection 'ci-private-network' (6ff53518-8ab0-58fa-aaf9-1d4f04c1efd5)
Oct 07 20:54:12 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870452.4250] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 07 20:54:12 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870452.4254] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 07 20:54:12 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870452.4264] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 07 20:54:12 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870452.4276] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 07 20:54:12 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870452.4324] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 07 20:54:12 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870452.4326] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 07 20:54:12 np0005474957.novalocal NetworkManager[3945]: <info>  [1759870452.4336] device (eth1): Activation: successful, device activated.
Oct 07 20:54:22 np0005474957.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct 07 20:54:27 np0005474957.novalocal sshd-session[1067]: Received disconnect from 38.102.83.114 port 59352:11: disconnected by user
Oct 07 20:54:27 np0005474957.novalocal sshd-session[1067]: Disconnected from user zuul 38.102.83.114 port 59352
Oct 07 20:54:27 np0005474957.novalocal sshd-session[1054]: pam_unix(sshd:session): session closed for user zuul
Oct 07 20:54:27 np0005474957.novalocal systemd-logind[798]: Session 1 logged out. Waiting for processes to exit.
Oct 07 20:54:47 np0005474957.novalocal sshd-session[4046]: Accepted publickey for zuul from 38.102.83.114 port 53020 ssh2: RSA SHA256:b0/HvgpjR8EeU2jAPuJSBF71pA7QnSIWfrY1zwtCvY4
Oct 07 20:54:47 np0005474957.novalocal systemd-logind[798]: New session 3 of user zuul.
Oct 07 20:54:47 np0005474957.novalocal systemd[1]: Started Session 3 of User zuul.
Oct 07 20:54:47 np0005474957.novalocal sshd-session[4046]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 07 20:54:48 np0005474957.novalocal sudo[4125]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyskvozybawgqbkftecygurvoekcqmrl ; OS_CLOUD=vexxhost /usr/bin/python3'
Oct 07 20:54:48 np0005474957.novalocal sudo[4125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 20:54:48 np0005474957.novalocal python3[4127]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 07 20:54:48 np0005474957.novalocal sudo[4125]: pam_unix(sudo:session): session closed for user root
Oct 07 20:54:48 np0005474957.novalocal sudo[4198]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdjfwbkitrnoskwwjtsefhxpalaxgzdx ; OS_CLOUD=vexxhost /usr/bin/python3'
Oct 07 20:54:48 np0005474957.novalocal sudo[4198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 20:54:48 np0005474957.novalocal python3[4200]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759870487.9506166-309-68568094018854/source _original_basename=tmpmh4nodvd follow=False checksum=95fd6c6d4700b39049c2f1b705a053ad2d7c5dd0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 20:54:48 np0005474957.novalocal sudo[4198]: pam_unix(sudo:session): session closed for user root
Oct 07 20:54:51 np0005474957.novalocal sshd-session[4049]: Connection closed by 38.102.83.114 port 53020
Oct 07 20:54:51 np0005474957.novalocal sshd-session[4046]: pam_unix(sshd:session): session closed for user zuul
Oct 07 20:54:51 np0005474957.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Oct 07 20:54:51 np0005474957.novalocal systemd-logind[798]: Session 3 logged out. Waiting for processes to exit.
Oct 07 20:54:51 np0005474957.novalocal systemd-logind[798]: Removed session 3.
Oct 07 20:57:08 np0005474957.novalocal systemd[1058]: Created slice User Background Tasks Slice.
Oct 07 20:57:08 np0005474957.novalocal systemd[1058]: Starting Cleanup of User's Temporary Files and Directories...
Oct 07 20:57:08 np0005474957.novalocal systemd[1058]: Finished Cleanup of User's Temporary Files and Directories.
Oct 07 20:59:03 np0005474957.novalocal sshd-session[4228]: Received disconnect from 193.46.255.244 port 60134:11:  [preauth]
Oct 07 20:59:03 np0005474957.novalocal sshd-session[4228]: Disconnected from authenticating user root 193.46.255.244 port 60134 [preauth]
Oct 07 21:00:06 np0005474957.novalocal sshd-session[4231]: Accepted publickey for zuul from 38.102.83.114 port 59786 ssh2: RSA SHA256:b0/HvgpjR8EeU2jAPuJSBF71pA7QnSIWfrY1zwtCvY4
Oct 07 21:00:06 np0005474957.novalocal systemd-logind[798]: New session 4 of user zuul.
Oct 07 21:00:06 np0005474957.novalocal systemd[1]: Started Session 4 of User zuul.
Oct 07 21:00:06 np0005474957.novalocal sshd-session[4231]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 07 21:00:06 np0005474957.novalocal sudo[4258]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-keriuyfebkfcbqpmtmycamnoroxrdvgx ; /usr/bin/python3'
Oct 07 21:00:06 np0005474957.novalocal sudo[4258]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:00:07 np0005474957.novalocal python3[4260]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda
                                                       _uses_shell=True zuul_log_id=fa163e3b-3c83-c1aa-4771-000000001cf3-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:00:07 np0005474957.novalocal sudo[4258]: pam_unix(sudo:session): session closed for user root
Oct 07 21:00:07 np0005474957.novalocal sudo[4286]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-saujwdgampjknxmmoeyvsffrzqvwfphm ; /usr/bin/python3'
Oct 07 21:00:07 np0005474957.novalocal sudo[4286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:00:07 np0005474957.novalocal python3[4288]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:00:07 np0005474957.novalocal sudo[4286]: pam_unix(sudo:session): session closed for user root
Oct 07 21:00:07 np0005474957.novalocal sudo[4312]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tabzxgyyioqanmiqpqpiiimcgntxissw ; /usr/bin/python3'
Oct 07 21:00:07 np0005474957.novalocal sudo[4312]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:00:07 np0005474957.novalocal python3[4314]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:00:07 np0005474957.novalocal sudo[4312]: pam_unix(sudo:session): session closed for user root
Oct 07 21:00:07 np0005474957.novalocal sudo[4339]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcsomopcajupwjhoticepqksqlglmopk ; /usr/bin/python3'
Oct 07 21:00:07 np0005474957.novalocal sudo[4339]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:00:08 np0005474957.novalocal python3[4341]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:00:08 np0005474957.novalocal sudo[4339]: pam_unix(sudo:session): session closed for user root
Oct 07 21:00:08 np0005474957.novalocal sudo[4365]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbqtxshbgsvkhmnesllmxymqwwrrnwxv ; /usr/bin/python3'
Oct 07 21:00:08 np0005474957.novalocal sudo[4365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:00:08 np0005474957.novalocal python3[4367]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:00:08 np0005474957.novalocal sudo[4365]: pam_unix(sudo:session): session closed for user root
Oct 07 21:00:08 np0005474957.novalocal sudo[4391]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-giriqbuhmhxsohuwdxfewohbtvhyhogo ; /usr/bin/python3'
Oct 07 21:00:08 np0005474957.novalocal sudo[4391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:00:08 np0005474957.novalocal python3[4393]: ansible-ansible.builtin.lineinfile Invoked with path=/etc/systemd/system.conf regexp=^#DefaultIOAccounting=no line=DefaultIOAccounting=yes state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:00:08 np0005474957.novalocal python3[4393]: ansible-ansible.builtin.lineinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Oct 07 21:00:08 np0005474957.novalocal sudo[4391]: pam_unix(sudo:session): session closed for user root
Oct 07 21:00:09 np0005474957.novalocal sudo[4417]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgcyplptiyvdwjcmhmjoouzkymvlfdpm ; /usr/bin/python3'
Oct 07 21:00:09 np0005474957.novalocal sudo[4417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:00:09 np0005474957.novalocal python3[4419]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 07 21:00:09 np0005474957.novalocal systemd[1]: Reloading.
Oct 07 21:00:09 np0005474957.novalocal systemd-rc-local-generator[4440]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:00:10 np0005474957.novalocal sudo[4417]: pam_unix(sudo:session): session closed for user root
Oct 07 21:00:11 np0005474957.novalocal sudo[4473]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzgtysobszgitulzlbvvduxfhtyhsxck ; /usr/bin/python3'
Oct 07 21:00:11 np0005474957.novalocal sudo[4473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:00:11 np0005474957.novalocal python3[4475]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Oct 07 21:00:11 np0005474957.novalocal sudo[4473]: pam_unix(sudo:session): session closed for user root
Oct 07 21:00:11 np0005474957.novalocal sudo[4499]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqltmxuyzkxokealahnvksgblhrdvtwo ; /usr/bin/python3'
Oct 07 21:00:11 np0005474957.novalocal sudo[4499]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:00:11 np0005474957.novalocal python3[4501]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:00:11 np0005474957.novalocal sudo[4499]: pam_unix(sudo:session): session closed for user root
Oct 07 21:00:12 np0005474957.novalocal sudo[4527]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvbpsnaupevlzrsanlfvtngtqkkcycsx ; /usr/bin/python3'
Oct 07 21:00:12 np0005474957.novalocal sudo[4527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:00:12 np0005474957.novalocal python3[4529]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:00:12 np0005474957.novalocal sudo[4527]: pam_unix(sudo:session): session closed for user root
Oct 07 21:00:12 np0005474957.novalocal sudo[4556]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oofszhfrxbfkfahpuwjsejzrsolckdrl ; /usr/bin/python3'
Oct 07 21:00:12 np0005474957.novalocal sudo[4556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:00:12 np0005474957.novalocal python3[4558]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:00:12 np0005474957.novalocal sudo[4556]: pam_unix(sudo:session): session closed for user root
Oct 07 21:00:12 np0005474957.novalocal sudo[4584]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehttvaquxcaylqsormempvzdxxnppuor ; /usr/bin/python3'
Oct 07 21:00:12 np0005474957.novalocal sudo[4584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:00:12 np0005474957.novalocal python3[4586]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:00:12 np0005474957.novalocal sudo[4584]: pam_unix(sudo:session): session closed for user root
Oct 07 21:00:13 np0005474957.novalocal python3[4613]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;
                                                       _uses_shell=True zuul_log_id=fa163e3b-3c83-c1aa-4771-000000001cf9-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:00:13 np0005474957.novalocal python3[4643]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 07 21:00:16 np0005474957.novalocal sshd-session[4234]: Connection closed by 38.102.83.114 port 59786
Oct 07 21:00:16 np0005474957.novalocal sshd-session[4231]: pam_unix(sshd:session): session closed for user zuul
Oct 07 21:00:16 np0005474957.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Oct 07 21:00:16 np0005474957.novalocal systemd[1]: session-4.scope: Consumed 3.860s CPU time.
Oct 07 21:00:16 np0005474957.novalocal systemd-logind[798]: Session 4 logged out. Waiting for processes to exit.
Oct 07 21:00:16 np0005474957.novalocal systemd-logind[798]: Removed session 4.
Oct 07 21:00:18 np0005474957.novalocal sshd-session[4648]: Accepted publickey for zuul from 38.102.83.114 port 42310 ssh2: RSA SHA256:b0/HvgpjR8EeU2jAPuJSBF71pA7QnSIWfrY1zwtCvY4
Oct 07 21:00:18 np0005474957.novalocal systemd-logind[798]: New session 5 of user zuul.
Oct 07 21:00:18 np0005474957.novalocal systemd[1]: Started Session 5 of User zuul.
Oct 07 21:00:18 np0005474957.novalocal sshd-session[4648]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 07 21:00:18 np0005474957.novalocal sudo[4675]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffkxwjqlidfapukytdqxvmnmgaikoyij ; /usr/bin/python3'
Oct 07 21:00:18 np0005474957.novalocal sudo[4675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:00:18 np0005474957.novalocal python3[4677]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Oct 07 21:00:32 np0005474957.novalocal kernel: SELinux:  Converting 363 SID table entries...
Oct 07 21:00:32 np0005474957.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Oct 07 21:00:32 np0005474957.novalocal kernel: SELinux:  policy capability open_perms=1
Oct 07 21:00:32 np0005474957.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Oct 07 21:00:32 np0005474957.novalocal kernel: SELinux:  policy capability always_check_network=0
Oct 07 21:00:32 np0005474957.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 07 21:00:32 np0005474957.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 07 21:00:32 np0005474957.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 07 21:00:41 np0005474957.novalocal kernel: SELinux:  Converting 363 SID table entries...
Oct 07 21:00:41 np0005474957.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Oct 07 21:00:41 np0005474957.novalocal kernel: SELinux:  policy capability open_perms=1
Oct 07 21:00:41 np0005474957.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Oct 07 21:00:41 np0005474957.novalocal kernel: SELinux:  policy capability always_check_network=0
Oct 07 21:00:41 np0005474957.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 07 21:00:41 np0005474957.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 07 21:00:41 np0005474957.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 07 21:00:52 np0005474957.novalocal kernel: SELinux:  Converting 363 SID table entries...
Oct 07 21:00:52 np0005474957.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Oct 07 21:00:52 np0005474957.novalocal kernel: SELinux:  policy capability open_perms=1
Oct 07 21:00:52 np0005474957.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Oct 07 21:00:52 np0005474957.novalocal kernel: SELinux:  policy capability always_check_network=0
Oct 07 21:00:52 np0005474957.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 07 21:00:52 np0005474957.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 07 21:00:52 np0005474957.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 07 21:00:53 np0005474957.novalocal setsebool[4741]: The virt_use_nfs policy boolean was changed to 1 by root
Oct 07 21:00:53 np0005474957.novalocal setsebool[4741]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Oct 07 21:01:01 np0005474957.novalocal CROND[4751]: (root) CMD (run-parts /etc/cron.hourly)
Oct 07 21:01:01 np0005474957.novalocal run-parts[4754]: (/etc/cron.hourly) starting 0anacron
Oct 07 21:01:01 np0005474957.novalocal anacron[4762]: Anacron started on 2025-10-07
Oct 07 21:01:01 np0005474957.novalocal anacron[4762]: Will run job `cron.daily' in 50 min.
Oct 07 21:01:01 np0005474957.novalocal anacron[4762]: Will run job `cron.weekly' in 70 min.
Oct 07 21:01:01 np0005474957.novalocal anacron[4762]: Will run job `cron.monthly' in 90 min.
Oct 07 21:01:01 np0005474957.novalocal anacron[4762]: Jobs will be executed sequentially
Oct 07 21:01:01 np0005474957.novalocal run-parts[4764]: (/etc/cron.hourly) finished 0anacron
Oct 07 21:01:01 np0005474957.novalocal CROND[4750]: (root) CMDEND (run-parts /etc/cron.hourly)
Oct 07 21:01:06 np0005474957.novalocal kernel: SELinux:  Converting 367 SID table entries...
Oct 07 21:01:06 np0005474957.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Oct 07 21:01:06 np0005474957.novalocal kernel: SELinux:  policy capability open_perms=1
Oct 07 21:01:06 np0005474957.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Oct 07 21:01:06 np0005474957.novalocal kernel: SELinux:  policy capability always_check_network=0
Oct 07 21:01:06 np0005474957.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 07 21:01:06 np0005474957.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 07 21:01:06 np0005474957.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 07 21:01:24 np0005474957.novalocal dbus-broker-launch[777]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Oct 07 21:01:24 np0005474957.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 07 21:01:25 np0005474957.novalocal systemd[1]: Starting man-db-cache-update.service...
Oct 07 21:01:25 np0005474957.novalocal systemd[1]: Reloading.
Oct 07 21:01:25 np0005474957.novalocal systemd-rc-local-generator[5512]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:01:25 np0005474957.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Oct 07 21:01:25 np0005474957.novalocal systemd[1]: Starting PackageKit Daemon...
Oct 07 21:01:25 np0005474957.novalocal PackageKit[6085]: daemon start
Oct 07 21:01:26 np0005474957.novalocal systemd[1]: Starting Authorization Manager...
Oct 07 21:01:26 np0005474957.novalocal polkitd[6157]: Started polkitd version 0.117
Oct 07 21:01:26 np0005474957.novalocal polkitd[6157]: Loading rules from directory /etc/polkit-1/rules.d
Oct 07 21:01:26 np0005474957.novalocal polkitd[6157]: Loading rules from directory /usr/share/polkit-1/rules.d
Oct 07 21:01:26 np0005474957.novalocal polkitd[6157]: Finished loading, compiling and executing 3 rules
Oct 07 21:01:26 np0005474957.novalocal systemd[1]: Started Authorization Manager.
Oct 07 21:01:26 np0005474957.novalocal polkitd[6157]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Oct 07 21:01:26 np0005474957.novalocal systemd[1]: Started PackageKit Daemon.
Oct 07 21:01:26 np0005474957.novalocal sudo[4675]: pam_unix(sudo:session): session closed for user root
Oct 07 21:01:27 np0005474957.novalocal python3[6864]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"
                                                       _uses_shell=True zuul_log_id=fa163e3b-3c83-bf29-872d-00000000000b-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:01:27 np0005474957.novalocal kernel: evm: overlay not supported
Oct 07 21:01:28 np0005474957.novalocal systemd[1058]: Starting D-Bus User Message Bus...
Oct 07 21:01:28 np0005474957.novalocal dbus-broker-launch[7681]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Oct 07 21:01:28 np0005474957.novalocal dbus-broker-launch[7681]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Oct 07 21:01:28 np0005474957.novalocal systemd[1058]: Started D-Bus User Message Bus.
Oct 07 21:01:28 np0005474957.novalocal dbus-broker-lau[7681]: Ready
Oct 07 21:01:28 np0005474957.novalocal systemd[1058]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Oct 07 21:01:28 np0005474957.novalocal systemd[1058]: Created slice Slice /user.
Oct 07 21:01:28 np0005474957.novalocal systemd[1058]: podman-7511.scope: unit configures an IP firewall, but not running as root.
Oct 07 21:01:28 np0005474957.novalocal systemd[1058]: (This warning is only shown for the first unit using IP firewalling.)
Oct 07 21:01:28 np0005474957.novalocal systemd[1058]: Started podman-7511.scope.
Oct 07 21:01:28 np0005474957.novalocal systemd[1058]: Started podman-pause-c92080a6.scope.
Oct 07 21:01:28 np0005474957.novalocal sudo[8351]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbqrsrymqpfkkoinslyrqdokrlfhvwye ; /usr/bin/python3'
Oct 07 21:01:28 np0005474957.novalocal sudo[8351]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:01:29 np0005474957.novalocal python3[8377]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]
                                                      location = "38.102.83.12:5001"
                                                      insecure = true path=/etc/containers/registries.conf block=[[registry]]
                                                      location = "38.102.83.12:5001"
                                                      insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:01:29 np0005474957.novalocal sudo[8351]: pam_unix(sudo:session): session closed for user root
Oct 07 21:01:29 np0005474957.novalocal sshd-session[4651]: Connection closed by 38.102.83.114 port 42310
Oct 07 21:01:29 np0005474957.novalocal sshd-session[4648]: pam_unix(sshd:session): session closed for user zuul
Oct 07 21:01:29 np0005474957.novalocal systemd[1]: session-5.scope: Deactivated successfully.
Oct 07 21:01:29 np0005474957.novalocal systemd[1]: session-5.scope: Consumed 1min 4.675s CPU time.
Oct 07 21:01:29 np0005474957.novalocal systemd-logind[798]: Session 5 logged out. Waiting for processes to exit.
Oct 07 21:01:29 np0005474957.novalocal systemd-logind[798]: Removed session 5.
Oct 07 21:01:33 np0005474957.novalocal sshd-session[10059]: Invalid user elasticsearch from 103.115.24.11 port 45106
Oct 07 21:01:33 np0005474957.novalocal sshd-session[10059]: Received disconnect from 103.115.24.11 port 45106:11: Bye Bye [preauth]
Oct 07 21:01:33 np0005474957.novalocal sshd-session[10059]: Disconnected from invalid user elasticsearch 103.115.24.11 port 45106 [preauth]
Oct 07 21:01:49 np0005474957.novalocal sshd-session[15279]: Unable to negotiate with 38.102.83.23 port 33452: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Oct 07 21:01:49 np0005474957.novalocal sshd-session[15280]: Connection closed by 38.102.83.23 port 33434 [preauth]
Oct 07 21:01:49 np0005474957.novalocal sshd-session[15281]: Connection closed by 38.102.83.23 port 33436 [preauth]
Oct 07 21:01:49 np0005474957.novalocal sshd-session[15283]: Unable to negotiate with 38.102.83.23 port 33438: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Oct 07 21:01:49 np0005474957.novalocal sshd-session[15284]: Unable to negotiate with 38.102.83.23 port 33462: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Oct 07 21:01:54 np0005474957.novalocal sshd-session[16904]: Accepted publickey for zuul from 38.102.83.114 port 55618 ssh2: RSA SHA256:b0/HvgpjR8EeU2jAPuJSBF71pA7QnSIWfrY1zwtCvY4
Oct 07 21:01:54 np0005474957.novalocal systemd-logind[798]: New session 6 of user zuul.
Oct 07 21:01:54 np0005474957.novalocal systemd[1]: Started Session 6 of User zuul.
Oct 07 21:01:54 np0005474957.novalocal sshd-session[16904]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 07 21:01:54 np0005474957.novalocal python3[16997]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPylehahPOwZLGTOxKZYO+oVSwxLhAJto71u9BW87u1qrQnu5vsi4YaO8jMCAoDq1P5mR1QMQ9B4gKxcEiHNXuA= zuul@np0005474956.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 07 21:01:54 np0005474957.novalocal sudo[17141]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqffjxdlzmbgslymtxcoejvtedtldemp ; /usr/bin/python3'
Oct 07 21:01:54 np0005474957.novalocal sudo[17141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:01:55 np0005474957.novalocal python3[17151]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPylehahPOwZLGTOxKZYO+oVSwxLhAJto71u9BW87u1qrQnu5vsi4YaO8jMCAoDq1P5mR1QMQ9B4gKxcEiHNXuA= zuul@np0005474956.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 07 21:01:55 np0005474957.novalocal sudo[17141]: pam_unix(sudo:session): session closed for user root
Oct 07 21:01:55 np0005474957.novalocal sudo[17445]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktyxhjjaupscfpcblgiicwwkzdvndirf ; /usr/bin/python3'
Oct 07 21:01:55 np0005474957.novalocal sudo[17445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:01:56 np0005474957.novalocal python3[17455]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005474957.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Oct 07 21:01:56 np0005474957.novalocal useradd[17509]: new group: name=cloud-admin, GID=1002
Oct 07 21:01:56 np0005474957.novalocal useradd[17509]: new user: name=cloud-admin, UID=1002, GID=1002, home=/home/cloud-admin, shell=/bin/bash, from=none
Oct 07 21:01:56 np0005474957.novalocal sudo[17445]: pam_unix(sudo:session): session closed for user root
Oct 07 21:01:56 np0005474957.novalocal sudo[17615]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djbrhzecsyclxsvivwnfkkyimnvackhs ; /usr/bin/python3'
Oct 07 21:01:56 np0005474957.novalocal sudo[17615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:01:56 np0005474957.novalocal python3[17623]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPylehahPOwZLGTOxKZYO+oVSwxLhAJto71u9BW87u1qrQnu5vsi4YaO8jMCAoDq1P5mR1QMQ9B4gKxcEiHNXuA= zuul@np0005474956.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 07 21:01:56 np0005474957.novalocal sudo[17615]: pam_unix(sudo:session): session closed for user root
Oct 07 21:01:56 np0005474957.novalocal sudo[17844]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amiznmsdseibqqsbhiojgoqrempcbdjz ; /usr/bin/python3'
Oct 07 21:01:56 np0005474957.novalocal sudo[17844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:01:57 np0005474957.novalocal python3[17852]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 07 21:01:57 np0005474957.novalocal sudo[17844]: pam_unix(sudo:session): session closed for user root
Oct 07 21:01:57 np0005474957.novalocal sudo[18080]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kefjyphuxblmllwbbxfhxvohqlbjesrk ; /usr/bin/python3'
Oct 07 21:01:57 np0005474957.novalocal sudo[18080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:01:57 np0005474957.novalocal python3[18087]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1759870916.7381084-151-143715498606390/source _original_basename=tmpsfa3n4nb follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:01:57 np0005474957.novalocal sudo[18080]: pam_unix(sudo:session): session closed for user root
Oct 07 21:01:58 np0005474957.novalocal sudo[18353]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfmshhdsaydgogdbjogiwcqkkrbtyjxe ; /usr/bin/python3'
Oct 07 21:01:58 np0005474957.novalocal sudo[18353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:01:58 np0005474957.novalocal python3[18364]: ansible-ansible.builtin.hostname Invoked with name=compute-0 use=systemd
Oct 07 21:01:58 np0005474957.novalocal systemd[1]: Starting Hostname Service...
Oct 07 21:01:58 np0005474957.novalocal systemd[1]: Started Hostname Service.
Oct 07 21:01:58 np0005474957.novalocal systemd-hostnamed[18465]: Changed pretty hostname to 'compute-0'
Oct 07 21:01:58 compute-0 systemd-hostnamed[18465]: Hostname set to <compute-0> (static)
Oct 07 21:01:58 compute-0 NetworkManager[3945]: <info>  [1759870918.7145] hostname: static hostname changed from "np0005474957.novalocal" to "compute-0"
Oct 07 21:01:58 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct 07 21:01:58 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct 07 21:01:58 compute-0 sudo[18353]: pam_unix(sudo:session): session closed for user root
Oct 07 21:01:59 compute-0 sshd-session[16944]: Connection closed by 38.102.83.114 port 55618
Oct 07 21:01:59 compute-0 sshd-session[16904]: pam_unix(sshd:session): session closed for user zuul
Oct 07 21:01:59 compute-0 systemd[1]: session-6.scope: Deactivated successfully.
Oct 07 21:01:59 compute-0 systemd[1]: session-6.scope: Consumed 2.618s CPU time.
Oct 07 21:01:59 compute-0 systemd-logind[798]: Session 6 logged out. Waiting for processes to exit.
Oct 07 21:01:59 compute-0 systemd-logind[798]: Removed session 6.
Oct 07 21:02:03 compute-0 irqbalance[793]: Cannot change IRQ 27 affinity: Operation not permitted
Oct 07 21:02:03 compute-0 irqbalance[793]: IRQ 27 affinity is now unmanaged
Oct 07 21:02:08 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct 07 21:02:25 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 07 21:02:25 compute-0 systemd[1]: Finished man-db-cache-update.service.
Oct 07 21:02:25 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1min 14.525s CPU time.
Oct 07 21:02:25 compute-0 systemd[1]: run-r356b90f6c81544aa963e30702ee8bacd.service: Deactivated successfully.
Oct 07 21:02:28 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct 07 21:05:27 compute-0 sshd-session[26569]: Accepted publickey for zuul from 38.102.83.23 port 57216 ssh2: RSA SHA256:b0/HvgpjR8EeU2jAPuJSBF71pA7QnSIWfrY1zwtCvY4
Oct 07 21:05:27 compute-0 systemd-logind[798]: New session 7 of user zuul.
Oct 07 21:05:27 compute-0 systemd[1]: Started Session 7 of User zuul.
Oct 07 21:05:27 compute-0 sshd-session[26569]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 07 21:05:28 compute-0 python3[26645]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 07 21:05:29 compute-0 sudo[26759]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxoocsxhjatvwzphfpghxhjujzdsjujq ; /usr/bin/python3'
Oct 07 21:05:29 compute-0 sudo[26759]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:05:29 compute-0 python3[26761]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 07 21:05:29 compute-0 sudo[26759]: pam_unix(sudo:session): session closed for user root
Oct 07 21:05:30 compute-0 sudo[26832]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tshjyymwfjtqdafvrhfylcifutxrgfat ; /usr/bin/python3'
Oct 07 21:05:30 compute-0 sudo[26832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:05:30 compute-0 python3[26834]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759871129.3399425-30455-169513445691600/source mode=0755 _original_basename=delorean.repo follow=False checksum=5293c0d9a8442784aebe60a17200da28f716313c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:05:30 compute-0 sudo[26832]: pam_unix(sudo:session): session closed for user root
Oct 07 21:05:30 compute-0 sudo[26858]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwoemjguqdkdlqowduuefbfxgftsncby ; /usr/bin/python3'
Oct 07 21:05:30 compute-0 sudo[26858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:05:30 compute-0 python3[26860]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-master-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 07 21:05:30 compute-0 sudo[26858]: pam_unix(sudo:session): session closed for user root
Oct 07 21:05:30 compute-0 sudo[26931]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbuujjcbqqkkdswgfvngquthzqebyigg ; /usr/bin/python3'
Oct 07 21:05:30 compute-0 sudo[26931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:05:30 compute-0 python3[26933]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759871129.3399425-30455-169513445691600/source mode=0755 _original_basename=delorean-master-testing.repo follow=False checksum=c22157e85d05af7ffbafa054f80958446d397a41 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:05:30 compute-0 sudo[26931]: pam_unix(sudo:session): session closed for user root
Oct 07 21:05:30 compute-0 sudo[26957]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkgzippsdzmeenucbkbsyjilopkvunsy ; /usr/bin/python3'
Oct 07 21:05:30 compute-0 sudo[26957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:05:31 compute-0 python3[26959]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 07 21:05:31 compute-0 sudo[26957]: pam_unix(sudo:session): session closed for user root
Oct 07 21:05:31 compute-0 sudo[27030]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zcypbqtphpldmasrlhlgppnkjfcncndg ; /usr/bin/python3'
Oct 07 21:05:31 compute-0 sudo[27030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:05:31 compute-0 python3[27032]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759871129.3399425-30455-169513445691600/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:05:31 compute-0 sudo[27030]: pam_unix(sudo:session): session closed for user root
Oct 07 21:05:31 compute-0 sudo[27056]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwntbtokemyfmwyrostrtedhmqmsatar ; /usr/bin/python3'
Oct 07 21:05:31 compute-0 sudo[27056]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:05:31 compute-0 python3[27058]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 07 21:05:31 compute-0 sudo[27056]: pam_unix(sudo:session): session closed for user root
Oct 07 21:05:31 compute-0 sudo[27129]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qstnxtslobnenaermsdfxannrwtuotom ; /usr/bin/python3'
Oct 07 21:05:31 compute-0 sudo[27129]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:05:32 compute-0 python3[27131]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759871129.3399425-30455-169513445691600/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:05:32 compute-0 sudo[27129]: pam_unix(sudo:session): session closed for user root
Oct 07 21:05:32 compute-0 sudo[27155]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lebjjxpcxlqsbeeccfcygiwhlpthhtkd ; /usr/bin/python3'
Oct 07 21:05:32 compute-0 sudo[27155]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:05:32 compute-0 python3[27157]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 07 21:05:32 compute-0 sudo[27155]: pam_unix(sudo:session): session closed for user root
Oct 07 21:05:32 compute-0 sudo[27228]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-roqhdwbwafhttiuhoqnscgnbbmizegbl ; /usr/bin/python3'
Oct 07 21:05:32 compute-0 sudo[27228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:05:32 compute-0 python3[27230]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759871129.3399425-30455-169513445691600/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:05:32 compute-0 sudo[27228]: pam_unix(sudo:session): session closed for user root
Oct 07 21:05:32 compute-0 sudo[27254]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysfqxlrcrjiamqrcuafvrydozyueobmd ; /usr/bin/python3'
Oct 07 21:05:32 compute-0 sudo[27254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:05:32 compute-0 python3[27256]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 07 21:05:32 compute-0 sudo[27254]: pam_unix(sudo:session): session closed for user root
Oct 07 21:05:33 compute-0 sudo[27327]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fiejirxcepthynjdtgadvgtftujrrwnu ; /usr/bin/python3'
Oct 07 21:05:33 compute-0 sudo[27327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:05:33 compute-0 python3[27329]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759871129.3399425-30455-169513445691600/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:05:33 compute-0 sudo[27327]: pam_unix(sudo:session): session closed for user root
Oct 07 21:05:33 compute-0 sudo[27353]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlcxrsprzprdvgyfsarsdlegltiewomu ; /usr/bin/python3'
Oct 07 21:05:33 compute-0 sudo[27353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:05:33 compute-0 python3[27355]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 07 21:05:33 compute-0 sudo[27353]: pam_unix(sudo:session): session closed for user root
Oct 07 21:05:33 compute-0 sudo[27426]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xleolveepsammwriejqyuismvikyvuqg ; /usr/bin/python3'
Oct 07 21:05:33 compute-0 sudo[27426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:05:33 compute-0 python3[27428]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759871129.3399425-30455-169513445691600/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=84bbbc6725cf608ee80f0b149ec7ae70d3eebcf5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:05:33 compute-0 sudo[27426]: pam_unix(sudo:session): session closed for user root
Oct 07 21:05:34 compute-0 sudo[27452]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqzrbcoqivxawtmoamhleuocohlzplsi ; /usr/bin/python3'
Oct 07 21:05:34 compute-0 sudo[27452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:05:34 compute-0 python3[27454]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/gating.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 07 21:05:34 compute-0 sudo[27452]: pam_unix(sudo:session): session closed for user root
Oct 07 21:05:34 compute-0 sudo[27525]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xleuzscqkyvoxvofwgnvwsummyiwtcwd ; /usr/bin/python3'
Oct 07 21:05:34 compute-0 sudo[27525]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:05:34 compute-0 python3[27527]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759871129.3399425-30455-169513445691600/source mode=0755 _original_basename=gating.repo follow=False checksum=1a79982167d42a022e2376aa48befb42d92ab51f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:05:34 compute-0 sudo[27525]: pam_unix(sudo:session): session closed for user root
Oct 07 21:05:37 compute-0 sshd-session[27552]: Connection closed by 192.168.122.11 port 46492 [preauth]
Oct 07 21:05:37 compute-0 sshd-session[27553]: Unable to negotiate with 192.168.122.11 port 46516: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Oct 07 21:05:37 compute-0 sshd-session[27554]: Unable to negotiate with 192.168.122.11 port 46502: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Oct 07 21:05:37 compute-0 sshd-session[27555]: Connection closed by 192.168.122.11 port 46476 [preauth]
Oct 07 21:05:37 compute-0 sshd-session[27557]: Unable to negotiate with 192.168.122.11 port 46500: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Oct 07 21:06:08 compute-0 systemd[1]: Starting Cleanup of Temporary Directories...
Oct 07 21:06:08 compute-0 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Oct 07 21:06:08 compute-0 systemd[1]: Finished Cleanup of Temporary Directories.
Oct 07 21:06:08 compute-0 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Oct 07 21:06:31 compute-0 PackageKit[6085]: daemon quit
Oct 07 21:06:31 compute-0 systemd[1]: packagekit.service: Deactivated successfully.
Oct 07 21:06:45 compute-0 python3[27589]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:07:17 compute-0 sshd-session[27591]: Invalid user teamspeak3 from 103.115.24.11 port 45684
Oct 07 21:07:17 compute-0 sshd-session[27591]: Received disconnect from 103.115.24.11 port 45684:11: Bye Bye [preauth]
Oct 07 21:07:17 compute-0 sshd-session[27591]: Disconnected from invalid user teamspeak3 103.115.24.11 port 45684 [preauth]
Oct 07 21:08:12 compute-0 sshd-session[27594]: Received disconnect from 193.46.255.244 port 13754:11:  [preauth]
Oct 07 21:08:12 compute-0 sshd-session[27594]: Disconnected from authenticating user root 193.46.255.244 port 13754 [preauth]
Oct 07 21:11:15 compute-0 sshd-session[27596]: Invalid user teamspeak3 from 103.115.24.11 port 38964
Oct 07 21:11:16 compute-0 sshd-session[27596]: Received disconnect from 103.115.24.11 port 38964:11: Bye Bye [preauth]
Oct 07 21:11:16 compute-0 sshd-session[27596]: Disconnected from invalid user teamspeak3 103.115.24.11 port 38964 [preauth]
Oct 07 21:11:45 compute-0 sshd-session[26572]: Received disconnect from 38.102.83.23 port 57216:11: disconnected by user
Oct 07 21:11:45 compute-0 sshd-session[26572]: Disconnected from user zuul 38.102.83.23 port 57216
Oct 07 21:11:45 compute-0 sshd-session[26569]: pam_unix(sshd:session): session closed for user zuul
Oct 07 21:11:45 compute-0 systemd[1]: session-7.scope: Deactivated successfully.
Oct 07 21:11:45 compute-0 systemd[1]: session-7.scope: Consumed 6.073s CPU time.
Oct 07 21:11:45 compute-0 systemd-logind[798]: Session 7 logged out. Waiting for processes to exit.
Oct 07 21:11:45 compute-0 systemd-logind[798]: Removed session 7.
Oct 07 21:15:14 compute-0 sshd-session[27601]: Invalid user student from 103.115.24.11 port 44712
Oct 07 21:15:15 compute-0 sshd-session[27601]: Received disconnect from 103.115.24.11 port 44712:11: Bye Bye [preauth]
Oct 07 21:15:15 compute-0 sshd-session[27601]: Disconnected from invalid user student 103.115.24.11 port 44712 [preauth]
Oct 07 21:17:11 compute-0 sshd-session[27603]: Received disconnect from 193.46.255.217 port 41750:11:  [preauth]
Oct 07 21:17:11 compute-0 sshd-session[27603]: Disconnected from authenticating user root 193.46.255.217 port 41750 [preauth]
Oct 07 21:18:18 compute-0 sshd-session[27605]: Accepted publickey for zuul from 192.168.122.30 port 33464 ssh2: ECDSA SHA256:OH28aSFFDwSUyue/q6XYLqn3ZIRCwAhLEh6x3UJ/Ca0
Oct 07 21:18:18 compute-0 systemd-logind[798]: New session 8 of user zuul.
Oct 07 21:18:18 compute-0 systemd[1]: Started Session 8 of User zuul.
Oct 07 21:18:18 compute-0 sshd-session[27605]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 07 21:18:19 compute-0 python3.9[27758]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 07 21:18:20 compute-0 sudo[27937]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqrpwkiqfoaigteamwdidjyylmztmrqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759871900.2801301-44-146501999969683/AnsiballZ_command.py'
Oct 07 21:18:20 compute-0 sudo[27937]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:18:20 compute-0 python3.9[27939]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:18:29 compute-0 sudo[27937]: pam_unix(sudo:session): session closed for user root
Oct 07 21:18:29 compute-0 sshd-session[27608]: Connection closed by 192.168.122.30 port 33464
Oct 07 21:18:29 compute-0 sshd-session[27605]: pam_unix(sshd:session): session closed for user zuul
Oct 07 21:18:29 compute-0 systemd[1]: session-8.scope: Deactivated successfully.
Oct 07 21:18:29 compute-0 systemd[1]: session-8.scope: Consumed 8.118s CPU time.
Oct 07 21:18:29 compute-0 systemd-logind[798]: Session 8 logged out. Waiting for processes to exit.
Oct 07 21:18:29 compute-0 systemd-logind[798]: Removed session 8.
Oct 07 21:18:35 compute-0 sshd-session[27999]: Accepted publickey for zuul from 192.168.122.30 port 35084 ssh2: ECDSA SHA256:OH28aSFFDwSUyue/q6XYLqn3ZIRCwAhLEh6x3UJ/Ca0
Oct 07 21:18:35 compute-0 systemd-logind[798]: New session 9 of user zuul.
Oct 07 21:18:35 compute-0 systemd[1]: Started Session 9 of User zuul.
Oct 07 21:18:35 compute-0 sshd-session[27999]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 07 21:18:36 compute-0 python3.9[28152]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 07 21:18:36 compute-0 sshd-session[28002]: Connection closed by 192.168.122.30 port 35084
Oct 07 21:18:36 compute-0 sshd-session[27999]: pam_unix(sshd:session): session closed for user zuul
Oct 07 21:18:36 compute-0 systemd[1]: session-9.scope: Deactivated successfully.
Oct 07 21:18:36 compute-0 systemd-logind[798]: Session 9 logged out. Waiting for processes to exit.
Oct 07 21:18:36 compute-0 systemd-logind[798]: Removed session 9.
Oct 07 21:18:52 compute-0 sshd-session[28181]: Accepted publickey for zuul from 192.168.122.30 port 43490 ssh2: ECDSA SHA256:OH28aSFFDwSUyue/q6XYLqn3ZIRCwAhLEh6x3UJ/Ca0
Oct 07 21:18:52 compute-0 systemd-logind[798]: New session 10 of user zuul.
Oct 07 21:18:52 compute-0 systemd[1]: Started Session 10 of User zuul.
Oct 07 21:18:52 compute-0 sshd-session[28181]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 07 21:18:53 compute-0 python3.9[28334]: ansible-ansible.legacy.ping Invoked with data=pong
Oct 07 21:18:54 compute-0 python3.9[28508]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 07 21:18:55 compute-0 sudo[28658]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipnltffdwrpgajgjgzsvyhqacffxbven ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759871934.893323-69-60203724277665/AnsiballZ_command.py'
Oct 07 21:18:55 compute-0 sudo[28658]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:18:55 compute-0 python3.9[28660]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:18:55 compute-0 sudo[28658]: pam_unix(sudo:session): session closed for user root
Oct 07 21:18:56 compute-0 sudo[28811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdfpwyjyfnhprwpmdruynlmuuuurfwdx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759871936.078345-93-175828895879008/AnsiballZ_stat.py'
Oct 07 21:18:56 compute-0 sudo[28811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:18:56 compute-0 python3.9[28813]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 07 21:18:56 compute-0 sudo[28811]: pam_unix(sudo:session): session closed for user root
Oct 07 21:18:57 compute-0 sudo[28963]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btpsbpmapsilwxpowlfzhncfsvybthsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759871937.0671208-109-54723204682750/AnsiballZ_file.py'
Oct 07 21:18:57 compute-0 sudo[28963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:18:57 compute-0 python3.9[28965]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:18:57 compute-0 sudo[28963]: pam_unix(sudo:session): session closed for user root
Oct 07 21:18:58 compute-0 sudo[29115]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-csmtridivxcodfkkhgsqqapzhsivmtiq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759871937.9906015-125-190264763373220/AnsiballZ_stat.py'
Oct 07 21:18:58 compute-0 sudo[29115]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:18:58 compute-0 python3.9[29117]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:18:58 compute-0 sudo[29115]: pam_unix(sudo:session): session closed for user root
Oct 07 21:18:59 compute-0 sudo[29238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dokyzessnqjstdowsbycvctvksgokznf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759871937.9906015-125-190264763373220/AnsiballZ_copy.py'
Oct 07 21:18:59 compute-0 sudo[29238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:18:59 compute-0 python3.9[29240]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1759871937.9906015-125-190264763373220/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:18:59 compute-0 sudo[29238]: pam_unix(sudo:session): session closed for user root
Oct 07 21:18:59 compute-0 sudo[29390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktmmlvstyzqkctdyomgqkhtjlwzhihyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759871939.4079285-155-11098997516316/AnsiballZ_setup.py'
Oct 07 21:18:59 compute-0 sudo[29390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:19:00 compute-0 python3.9[29392]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 07 21:19:00 compute-0 sudo[29390]: pam_unix(sudo:session): session closed for user root
Oct 07 21:19:00 compute-0 sudo[29546]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slzxmiaysozowugvksorjjxoyjqkrdrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759871940.5087142-171-110892977123141/AnsiballZ_file.py'
Oct 07 21:19:00 compute-0 sudo[29546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:19:01 compute-0 python3.9[29548]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:19:01 compute-0 sudo[29546]: pam_unix(sudo:session): session closed for user root
Oct 07 21:19:02 compute-0 python3.9[29698]: ansible-ansible.builtin.service_facts Invoked
Oct 07 21:19:07 compute-0 python3.9[29953]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:19:08 compute-0 python3.9[30103]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 07 21:19:09 compute-0 python3.9[30259]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 07 21:19:10 compute-0 sudo[30415]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvjyfeukpxeobuhofkkfkidqprmqjqvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759871950.177657-267-105950206156797/AnsiballZ_setup.py'
Oct 07 21:19:10 compute-0 sudo[30415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:19:10 compute-0 python3.9[30417]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 07 21:19:11 compute-0 sudo[30415]: pam_unix(sudo:session): session closed for user root
Oct 07 21:19:11 compute-0 sshd-session[30104]: Invalid user student from 103.115.24.11 port 55098
Oct 07 21:19:11 compute-0 sudo[30499]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzukbmgwlgexjwieguznnzhdgbvvdigx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759871950.177657-267-105950206156797/AnsiballZ_dnf.py'
Oct 07 21:19:11 compute-0 sudo[30499]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:19:11 compute-0 sshd-session[30104]: Received disconnect from 103.115.24.11 port 55098:11: Bye Bye [preauth]
Oct 07 21:19:11 compute-0 sshd-session[30104]: Disconnected from invalid user student 103.115.24.11 port 55098 [preauth]
Oct 07 21:19:11 compute-0 python3.9[30501]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 07 21:19:55 compute-0 systemd[1]: Reloading.
Oct 07 21:19:56 compute-0 systemd-rc-local-generator[30686]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:19:56 compute-0 systemd[1]: Starting dnf makecache...
Oct 07 21:19:56 compute-0 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Oct 07 21:19:56 compute-0 dnf[30703]: Repository 'gating-repo' is missing name in configuration, using id.
Oct 07 21:19:56 compute-0 dnf[30703]: Failed determining last makecache time.
Oct 07 21:19:56 compute-0 dnf[30703]: delorean-openstack-barbican-42b4c41831408a8e323 138 kB/s | 3.0 kB     00:00
Oct 07 21:19:56 compute-0 dnf[30703]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 159 kB/s | 3.0 kB     00:00
Oct 07 21:19:56 compute-0 dnf[30703]: delorean-openstack-cinder-1c00d6490d88e436f26ef 177 kB/s | 3.0 kB     00:00
Oct 07 21:19:56 compute-0 dnf[30703]: delorean-python-stevedore-c4acc5639fd2329372142 175 kB/s | 3.0 kB     00:00
Oct 07 21:19:56 compute-0 dnf[30703]: delorean-python-cloudkitty-tests-tempest-3961dc 168 kB/s | 3.0 kB     00:00
Oct 07 21:19:56 compute-0 dnf[30703]: delorean-diskimage-builder-43381184423c185801b5 165 kB/s | 3.0 kB     00:00
Oct 07 21:19:56 compute-0 dnf[30703]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 163 kB/s | 3.0 kB     00:00
Oct 07 21:19:56 compute-0 dnf[30703]: delorean-python-designate-tests-tempest-347fdbc 176 kB/s | 3.0 kB     00:00
Oct 07 21:19:56 compute-0 dnf[30703]: delorean-openstack-glance-1fd12c29b339f30fe823e 163 kB/s | 3.0 kB     00:00
Oct 07 21:19:56 compute-0 dnf[30703]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 165 kB/s | 3.0 kB     00:00
Oct 07 21:19:56 compute-0 dnf[30703]: delorean-openstack-manila-3c01b7181572c95dac462 168 kB/s | 3.0 kB     00:00
Oct 07 21:19:56 compute-0 dnf[30703]: delorean-python-vmware-nsxlib-458234972d1428ac9 164 kB/s | 3.0 kB     00:00
Oct 07 21:19:56 compute-0 dnf[30703]: delorean-openstack-octavia-ba397f07a7331190208c 172 kB/s | 3.0 kB     00:00
Oct 07 21:19:56 compute-0 dnf[30703]: delorean-openstack-watcher-c014f81a8647287f6dcc 152 kB/s | 3.0 kB     00:00
Oct 07 21:19:56 compute-0 dnf[30703]: delorean-edpm-image-builder-55ba53cf215b14ed95b 131 kB/s | 3.0 kB     00:00
Oct 07 21:19:56 compute-0 systemd[1]: Reloading.
Oct 07 21:19:56 compute-0 dnf[30703]: delorean-puppet-ceph-b0c245ccde541a63fde0564366 140 kB/s | 3.0 kB     00:00
Oct 07 21:19:56 compute-0 dnf[30703]: delorean-openstack-swift-dc98a8463506ac520c469a 124 kB/s | 3.0 kB     00:00
Oct 07 21:19:56 compute-0 dnf[30703]: delorean-python-tempestconf-8515371b7cceebd4282 126 kB/s | 3.0 kB     00:00
Oct 07 21:19:56 compute-0 systemd-rc-local-generator[30745]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:19:56 compute-0 dnf[30703]: delorean-openstack-heat-ui-013accbfd179753bc3f0 124 kB/s | 3.0 kB     00:00
Oct 07 21:19:56 compute-0 dnf[30703]: gating-repo                                     261 kB/s | 1.5 kB     00:00
Oct 07 21:19:56 compute-0 dnf[30703]: CentOS Stream 9 - BaseOS                         60 kB/s | 6.7 kB     00:00
Oct 07 21:19:56 compute-0 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Oct 07 21:19:57 compute-0 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Oct 07 21:19:57 compute-0 systemd[1]: Reloading.
Oct 07 21:19:57 compute-0 systemd-rc-local-generator[30789]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:19:57 compute-0 dnf[30703]: CentOS Stream 9 - AppStream                      60 kB/s | 6.8 kB     00:00
Oct 07 21:19:57 compute-0 systemd[1]: Listening on LVM2 poll daemon socket.
Oct 07 21:19:57 compute-0 dnf[30703]: CentOS Stream 9 - CRB                            69 kB/s | 6.6 kB     00:00
Oct 07 21:19:57 compute-0 dnf[30703]: CentOS Stream 9 - Extras packages                69 kB/s | 8.0 kB     00:00
Oct 07 21:19:57 compute-0 dnf[30703]: dlrn-antelope-testing                           156 kB/s | 3.0 kB     00:00
Oct 07 21:19:57 compute-0 dnf[30703]: dlrn-antelope-build-deps                        155 kB/s | 3.0 kB     00:00
Oct 07 21:19:57 compute-0 dnf[30703]: centos9-rabbitmq                                 72 kB/s | 3.0 kB     00:00
Oct 07 21:19:57 compute-0 dnf[30703]: centos9-storage                                 123 kB/s | 3.0 kB     00:00
Oct 07 21:19:57 compute-0 dnf[30703]: centos9-opstools                                101 kB/s | 3.0 kB     00:00
Oct 07 21:19:57 compute-0 dnf[30703]: NFV SIG OpenvSwitch                              68 kB/s | 3.0 kB     00:00
Oct 07 21:19:57 compute-0 dnf[30703]: repo-setup-centos-appstream                     132 kB/s | 4.4 kB     00:00
Oct 07 21:19:57 compute-0 dbus-broker-launch[763]: Noticed file-system modification, trigger reload.
Oct 07 21:19:57 compute-0 dbus-broker-launch[763]: Noticed file-system modification, trigger reload.
Oct 07 21:19:57 compute-0 dbus-broker-launch[763]: Noticed file-system modification, trigger reload.
Oct 07 21:19:57 compute-0 dnf[30703]: repo-setup-centos-baseos                        118 kB/s | 3.9 kB     00:00
Oct 07 21:19:57 compute-0 dnf[30703]: repo-setup-centos-highavailability              130 kB/s | 3.9 kB     00:00
Oct 07 21:19:57 compute-0 dnf[30703]: repo-setup-centos-powertools                    186 kB/s | 4.3 kB     00:00
Oct 07 21:19:58 compute-0 dnf[30703]: Extra Packages for Enterprise Linux 9 - x86_64  276 kB/s |  34 kB     00:00
Oct 07 21:19:58 compute-0 dnf[30703]: Metadata cache created.
Oct 07 21:19:58 compute-0 systemd[1]: dnf-makecache.service: Deactivated successfully.
Oct 07 21:19:58 compute-0 systemd[1]: Finished dnf makecache.
Oct 07 21:19:58 compute-0 systemd[1]: dnf-makecache.service: Consumed 1.685s CPU time.
Oct 07 21:21:16 compute-0 kernel: SELinux:  Converting 2714 SID table entries...
Oct 07 21:21:16 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Oct 07 21:21:16 compute-0 kernel: SELinux:  policy capability open_perms=1
Oct 07 21:21:16 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Oct 07 21:21:16 compute-0 kernel: SELinux:  policy capability always_check_network=0
Oct 07 21:21:16 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 07 21:21:16 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 07 21:21:16 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 07 21:21:16 compute-0 dbus-broker-launch[777]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Oct 07 21:21:17 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 07 21:21:17 compute-0 systemd[1]: Starting man-db-cache-update.service...
Oct 07 21:21:17 compute-0 systemd[1]: Reloading.
Oct 07 21:21:17 compute-0 systemd-rc-local-generator[31126]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:21:17 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Oct 07 21:21:17 compute-0 systemd[1]: Starting PackageKit Daemon...
Oct 07 21:21:17 compute-0 PackageKit[31462]: daemon start
Oct 07 21:21:17 compute-0 systemd[1]: Started PackageKit Daemon.
Oct 07 21:21:17 compute-0 sudo[30499]: pam_unix(sudo:session): session closed for user root
Oct 07 21:21:18 compute-0 sudo[32045]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irlviaazuumhfeietrhzpfvgseqzzgpy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872078.011013-291-257383390893521/AnsiballZ_command.py'
Oct 07 21:21:18 compute-0 sudo[32045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:21:18 compute-0 python3.9[32047]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:21:18 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 07 21:21:18 compute-0 systemd[1]: Finished man-db-cache-update.service.
Oct 07 21:21:18 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.115s CPU time.
Oct 07 21:21:18 compute-0 systemd[1]: run-r4d56312a50f54be994b795ab4d0f4438.service: Deactivated successfully.
Oct 07 21:21:20 compute-0 sudo[32045]: pam_unix(sudo:session): session closed for user root
Oct 07 21:21:21 compute-0 sudo[32328]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cnmnqfwylpcrdvxrtwbiqofocbvrgxri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872080.3832877-307-264671975115368/AnsiballZ_selinux.py'
Oct 07 21:21:21 compute-0 sudo[32328]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:21:21 compute-0 python3.9[32330]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Oct 07 21:21:21 compute-0 sudo[32328]: pam_unix(sudo:session): session closed for user root
Oct 07 21:21:22 compute-0 sudo[32480]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ziiwnzmlvxoasjqocgoyjmqaqvybfaue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872081.8561635-329-137388046386214/AnsiballZ_command.py'
Oct 07 21:21:22 compute-0 sudo[32480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:21:22 compute-0 python3.9[32482]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Oct 07 21:21:23 compute-0 sudo[32480]: pam_unix(sudo:session): session closed for user root
Oct 07 21:21:23 compute-0 sudo[32633]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zcxmwsadzmjrhdclponbqfrzfsdgugkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872083.565351-345-241421200366909/AnsiballZ_file.py'
Oct 07 21:21:23 compute-0 sudo[32633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:21:25 compute-0 python3.9[32635]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:21:25 compute-0 sudo[32633]: pam_unix(sudo:session): session closed for user root
Oct 07 21:21:26 compute-0 sudo[32785]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqfhlybrykbsmfscxxomlswcpiqqvakj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872085.715519-361-80624253148886/AnsiballZ_mount.py'
Oct 07 21:21:26 compute-0 sudo[32785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:21:26 compute-0 python3.9[32787]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Oct 07 21:21:26 compute-0 sudo[32785]: pam_unix(sudo:session): session closed for user root
Oct 07 21:21:27 compute-0 sudo[32937]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aboztzqbduimhkhtremmyfzsqwakyucy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872087.2084644-417-11113018372082/AnsiballZ_file.py'
Oct 07 21:21:27 compute-0 sudo[32937]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:21:27 compute-0 python3.9[32939]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:21:27 compute-0 sudo[32937]: pam_unix(sudo:session): session closed for user root
Oct 07 21:21:28 compute-0 sudo[33089]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnjoyubcqxhykxvcpwpbfbtyukraewlx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872087.973802-433-125101954259145/AnsiballZ_stat.py'
Oct 07 21:21:28 compute-0 sudo[33089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:21:28 compute-0 python3.9[33091]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:21:28 compute-0 sudo[33089]: pam_unix(sudo:session): session closed for user root
Oct 07 21:21:28 compute-0 sudo[33212]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-woiyyuvdtzpvdlvptgfdsnmvqzvjqbks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872087.973802-433-125101954259145/AnsiballZ_copy.py'
Oct 07 21:21:28 compute-0 sudo[33212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:21:32 compute-0 python3.9[33214]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759872087.973802-433-125101954259145/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8f107cf58c6c519943bb67ea5517de98604df546 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:21:32 compute-0 sudo[33212]: pam_unix(sudo:session): session closed for user root
Oct 07 21:21:34 compute-0 sudo[33364]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jtvdnqqxvqfouxscrxecuvtlsntpohed ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872094.2424061-487-90749259906988/AnsiballZ_getent.py'
Oct 07 21:21:34 compute-0 sudo[33364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:21:34 compute-0 python3.9[33366]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Oct 07 21:21:34 compute-0 sudo[33364]: pam_unix(sudo:session): session closed for user root
Oct 07 21:21:35 compute-0 sudo[33517]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbhplcfctesuaacnefzrzxydnpzjnmzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872095.2227082-503-50426961477687/AnsiballZ_group.py'
Oct 07 21:21:35 compute-0 sudo[33517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:21:35 compute-0 python3.9[33519]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct 07 21:21:36 compute-0 groupadd[33520]: group added to /etc/group: name=qemu, GID=107
Oct 07 21:21:36 compute-0 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 07 21:21:36 compute-0 groupadd[33520]: group added to /etc/gshadow: name=qemu
Oct 07 21:21:36 compute-0 groupadd[33520]: new group: name=qemu, GID=107
Oct 07 21:21:36 compute-0 sudo[33517]: pam_unix(sudo:session): session closed for user root
Oct 07 21:21:36 compute-0 sudo[33676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijdpesiamqltngynmpqlmxsslqwsooui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872096.3728385-519-47601408879948/AnsiballZ_user.py'
Oct 07 21:21:36 compute-0 sudo[33676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:21:37 compute-0 python3.9[33678]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct 07 21:21:37 compute-0 useradd[33680]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=/dev/pts/0
Oct 07 21:21:37 compute-0 sudo[33676]: pam_unix(sudo:session): session closed for user root
Oct 07 21:21:38 compute-0 sudo[33836]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmyiroctehbklhbzfynmuojwdeomkbhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872097.8154085-535-174511093852944/AnsiballZ_getent.py'
Oct 07 21:21:38 compute-0 sudo[33836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:21:38 compute-0 python3.9[33838]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Oct 07 21:21:38 compute-0 sudo[33836]: pam_unix(sudo:session): session closed for user root
Oct 07 21:21:38 compute-0 sudo[33989]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-crlmpkculpfodzkzlzczcadxxugskgtx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872098.5449433-551-165959567858102/AnsiballZ_group.py'
Oct 07 21:21:38 compute-0 sudo[33989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:21:39 compute-0 python3.9[33991]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct 07 21:21:39 compute-0 groupadd[33992]: group added to /etc/group: name=hugetlbfs, GID=42477
Oct 07 21:21:39 compute-0 groupadd[33992]: group added to /etc/gshadow: name=hugetlbfs
Oct 07 21:21:39 compute-0 groupadd[33992]: new group: name=hugetlbfs, GID=42477
Oct 07 21:21:39 compute-0 sudo[33989]: pam_unix(sudo:session): session closed for user root
Oct 07 21:21:39 compute-0 sudo[34147]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsukpefiyefcvvfubjquixxwbagnhkaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872099.5473762-569-57046855712987/AnsiballZ_file.py'
Oct 07 21:21:39 compute-0 sudo[34147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:21:40 compute-0 python3.9[34149]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Oct 07 21:21:40 compute-0 sudo[34147]: pam_unix(sudo:session): session closed for user root
Oct 07 21:21:40 compute-0 sudo[34299]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acizamxfrenxgqektlsnpuloctbifido ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872100.4610944-591-67674715918041/AnsiballZ_dnf.py'
Oct 07 21:21:40 compute-0 sudo[34299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:21:41 compute-0 python3.9[34301]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 07 21:21:45 compute-0 sudo[34299]: pam_unix(sudo:session): session closed for user root
Oct 07 21:21:45 compute-0 sudo[34452]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agrhklemyxcpzdmexobpelyrxfqhkuhv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872105.4654849-607-31914658792824/AnsiballZ_file.py'
Oct 07 21:21:45 compute-0 sudo[34452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:21:46 compute-0 python3.9[34454]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:21:46 compute-0 sudo[34452]: pam_unix(sudo:session): session closed for user root
Oct 07 21:21:46 compute-0 sudo[34604]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uldawwosapqagywvkhgaqiprvialgytg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872106.305267-623-127736616389953/AnsiballZ_stat.py'
Oct 07 21:21:46 compute-0 sudo[34604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:21:46 compute-0 python3.9[34606]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:21:46 compute-0 sudo[34604]: pam_unix(sudo:session): session closed for user root
Oct 07 21:21:47 compute-0 sudo[34727]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjdlxohuicxbmpjgzzindlbumvjhtomj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872106.305267-623-127736616389953/AnsiballZ_copy.py'
Oct 07 21:21:47 compute-0 sudo[34727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:21:47 compute-0 python3.9[34729]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759872106.305267-623-127736616389953/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:21:47 compute-0 sudo[34727]: pam_unix(sudo:session): session closed for user root
Oct 07 21:21:48 compute-0 sudo[34879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghwlahqziggmttauswaegrqabdoyjeuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872107.7890103-653-178179438023799/AnsiballZ_systemd.py'
Oct 07 21:21:48 compute-0 sudo[34879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:21:48 compute-0 python3.9[34881]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 07 21:21:48 compute-0 systemd[1]: Starting Load Kernel Modules...
Oct 07 21:21:48 compute-0 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Oct 07 21:21:48 compute-0 kernel: Bridge firewalling registered
Oct 07 21:21:48 compute-0 systemd-modules-load[34885]: Inserted module 'br_netfilter'
Oct 07 21:21:48 compute-0 systemd[1]: Finished Load Kernel Modules.
Oct 07 21:21:48 compute-0 sudo[34879]: pam_unix(sudo:session): session closed for user root
Oct 07 21:21:49 compute-0 sudo[35038]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzmclyhmcdzscjyirvovoyhkjnwewmjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872109.0820918-669-208479006522964/AnsiballZ_stat.py'
Oct 07 21:21:49 compute-0 sudo[35038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:21:49 compute-0 python3.9[35040]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:21:49 compute-0 sudo[35038]: pam_unix(sudo:session): session closed for user root
Oct 07 21:21:49 compute-0 sudo[35161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xfzcdkrihenqryxvvgdkhdywtvsffolb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872109.0820918-669-208479006522964/AnsiballZ_copy.py'
Oct 07 21:21:49 compute-0 sudo[35161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:21:50 compute-0 python3.9[35163]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759872109.0820918-669-208479006522964/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:21:50 compute-0 sudo[35161]: pam_unix(sudo:session): session closed for user root
Oct 07 21:21:50 compute-0 sudo[35313]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxgyaomgdpofegibyctljsvopssspiot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872110.6031132-705-64600105611359/AnsiballZ_dnf.py'
Oct 07 21:21:50 compute-0 sudo[35313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:21:51 compute-0 python3.9[35315]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 07 21:21:54 compute-0 dbus-broker-launch[763]: Noticed file-system modification, trigger reload.
Oct 07 21:21:54 compute-0 dbus-broker-launch[763]: Noticed file-system modification, trigger reload.
Oct 07 21:21:54 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 07 21:21:54 compute-0 systemd[1]: Starting man-db-cache-update.service...
Oct 07 21:21:54 compute-0 systemd[1]: Reloading.
Oct 07 21:21:54 compute-0 systemd-rc-local-generator[35376]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:21:54 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Oct 07 21:21:55 compute-0 sudo[35313]: pam_unix(sudo:session): session closed for user root
Oct 07 21:21:56 compute-0 python3.9[36752]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 07 21:21:57 compute-0 python3.9[37721]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Oct 07 21:21:57 compute-0 python3.9[38416]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 07 21:21:58 compute-0 sudo[39348]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-advurxxwmdchyxoahwlljccaqbbwubdv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872118.074141-783-218585222464166/AnsiballZ_command.py'
Oct 07 21:21:58 compute-0 sudo[39348]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:21:58 compute-0 python3.9[39368]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:21:58 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Oct 07 21:21:58 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 07 21:21:58 compute-0 systemd[1]: Finished man-db-cache-update.service.
Oct 07 21:21:58 compute-0 systemd[1]: man-db-cache-update.service: Consumed 5.132s CPU time.
Oct 07 21:21:58 compute-0 systemd[1]: run-r27c76a3872554022975212edc5d302a9.service: Deactivated successfully.
Oct 07 21:21:59 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Oct 07 21:21:59 compute-0 sudo[39348]: pam_unix(sudo:session): session closed for user root
Oct 07 21:21:59 compute-0 sudo[39878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysfsmkmkvbvvqkvzgqqrjgzxlfzsmmwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872119.5603445-801-45586201803958/AnsiballZ_systemd.py'
Oct 07 21:21:59 compute-0 sudo[39878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:22:00 compute-0 python3.9[39880]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 07 21:22:00 compute-0 systemd[1]: Stopping Dynamic System Tuning Daemon...
Oct 07 21:22:00 compute-0 systemd[1]: tuned.service: Deactivated successfully.
Oct 07 21:22:00 compute-0 systemd[1]: Stopped Dynamic System Tuning Daemon.
Oct 07 21:22:00 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Oct 07 21:22:00 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Oct 07 21:22:00 compute-0 sudo[39878]: pam_unix(sudo:session): session closed for user root
Oct 07 21:22:01 compute-0 python3.9[40042]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Oct 07 21:22:04 compute-0 sudo[40192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwxcbvocomkstwzuvrswnsrmdptmgngy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872123.9039423-915-59989214262040/AnsiballZ_systemd.py'
Oct 07 21:22:04 compute-0 sudo[40192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:22:04 compute-0 python3.9[40194]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 07 21:22:04 compute-0 systemd[1]: Reloading.
Oct 07 21:22:04 compute-0 systemd-rc-local-generator[40222]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:22:04 compute-0 sudo[40192]: pam_unix(sudo:session): session closed for user root
Oct 07 21:22:05 compute-0 sudo[40381]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzekkqcnpuvwtlytmmvebipabxbtuisr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872125.0060964-915-222453823526546/AnsiballZ_systemd.py'
Oct 07 21:22:05 compute-0 sudo[40381]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:22:05 compute-0 python3.9[40383]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 07 21:22:06 compute-0 systemd[1]: Reloading.
Oct 07 21:22:06 compute-0 systemd-rc-local-generator[40406]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:22:07 compute-0 sudo[40381]: pam_unix(sudo:session): session closed for user root
Oct 07 21:22:07 compute-0 sudo[40569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yeaxlipspuhbvkuktbepaugkatdihvyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872127.3232832-947-21251061905798/AnsiballZ_command.py'
Oct 07 21:22:07 compute-0 sudo[40569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:22:07 compute-0 python3.9[40571]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:22:07 compute-0 sudo[40569]: pam_unix(sudo:session): session closed for user root
Oct 07 21:22:08 compute-0 sudo[40722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhnyosmxemkmqmxzacwfhiydersngbhy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872128.079224-963-13665631642090/AnsiballZ_command.py'
Oct 07 21:22:08 compute-0 sudo[40722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:22:08 compute-0 python3.9[40724]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:22:08 compute-0 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Oct 07 21:22:08 compute-0 sudo[40722]: pam_unix(sudo:session): session closed for user root
Oct 07 21:22:09 compute-0 sudo[40875]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-waoljvjkiyqkfxktvjvgymdrmpgcruat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872128.82293-979-147982446436800/AnsiballZ_command.py'
Oct 07 21:22:09 compute-0 sudo[40875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:22:09 compute-0 python3.9[40877]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:22:10 compute-0 sudo[40875]: pam_unix(sudo:session): session closed for user root
Oct 07 21:22:11 compute-0 sudo[41037]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjjwovadnuraudqcmxnmtspxyslkafzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872131.1141856-995-165992985809928/AnsiballZ_command.py'
Oct 07 21:22:11 compute-0 sudo[41037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:22:11 compute-0 python3.9[41039]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:22:11 compute-0 sudo[41037]: pam_unix(sudo:session): session closed for user root
Oct 07 21:22:12 compute-0 sudo[41190]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cofjruehbobgoeaenzvldypwdwcubtbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872131.911157-1011-68997441970830/AnsiballZ_systemd.py'
Oct 07 21:22:12 compute-0 sudo[41190]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:22:12 compute-0 python3.9[41192]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 07 21:22:12 compute-0 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Oct 07 21:22:12 compute-0 systemd[1]: Stopped Apply Kernel Variables.
Oct 07 21:22:12 compute-0 systemd[1]: Stopping Apply Kernel Variables...
Oct 07 21:22:12 compute-0 systemd[1]: Starting Apply Kernel Variables...
Oct 07 21:22:12 compute-0 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Oct 07 21:22:12 compute-0 systemd[1]: Finished Apply Kernel Variables.
Oct 07 21:22:12 compute-0 sudo[41190]: pam_unix(sudo:session): session closed for user root
Oct 07 21:22:13 compute-0 sshd-session[28184]: Connection closed by 192.168.122.30 port 43490
Oct 07 21:22:13 compute-0 sshd-session[28181]: pam_unix(sshd:session): session closed for user zuul
Oct 07 21:22:13 compute-0 systemd[1]: session-10.scope: Deactivated successfully.
Oct 07 21:22:13 compute-0 systemd[1]: session-10.scope: Consumed 2min 12.516s CPU time.
Oct 07 21:22:13 compute-0 systemd-logind[798]: Session 10 logged out. Waiting for processes to exit.
Oct 07 21:22:13 compute-0 systemd-logind[798]: Removed session 10.
Oct 07 21:22:13 compute-0 irqbalance[793]: Cannot change IRQ 26 affinity: Operation not permitted
Oct 07 21:22:13 compute-0 irqbalance[793]: IRQ 26 affinity is now unmanaged
Oct 07 21:22:18 compute-0 sshd-session[41222]: Accepted publickey for zuul from 192.168.122.30 port 42300 ssh2: ECDSA SHA256:OH28aSFFDwSUyue/q6XYLqn3ZIRCwAhLEh6x3UJ/Ca0
Oct 07 21:22:18 compute-0 systemd-logind[798]: New session 11 of user zuul.
Oct 07 21:22:18 compute-0 systemd[1]: Started Session 11 of User zuul.
Oct 07 21:22:18 compute-0 sshd-session[41222]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 07 21:22:19 compute-0 python3.9[41375]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 07 21:22:20 compute-0 python3.9[41529]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 07 21:22:21 compute-0 sudo[41683]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dccoopptpfsgdnpyvtnzlochijixrhlc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872141.2263315-80-55186755589359/AnsiballZ_command.py'
Oct 07 21:22:21 compute-0 sudo[41683]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:22:21 compute-0 python3.9[41685]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:22:21 compute-0 sudo[41683]: pam_unix(sudo:session): session closed for user root
Oct 07 21:22:23 compute-0 python3.9[41836]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 07 21:22:23 compute-0 sudo[41990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfbsxehnnlbqsokpbdxmymdnbjqguwoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872143.4201512-120-118414369668252/AnsiballZ_setup.py'
Oct 07 21:22:23 compute-0 sudo[41990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:22:24 compute-0 python3.9[41992]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 07 21:22:24 compute-0 sudo[41990]: pam_unix(sudo:session): session closed for user root
Oct 07 21:22:24 compute-0 sudo[42074]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-npyaoyhpxnnlxmbqypytgwxzaeqlberc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872143.4201512-120-118414369668252/AnsiballZ_dnf.py'
Oct 07 21:22:24 compute-0 sudo[42074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:22:24 compute-0 python3.9[42076]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 07 21:22:26 compute-0 sudo[42074]: pam_unix(sudo:session): session closed for user root
Oct 07 21:22:26 compute-0 sudo[42227]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xcvaxpvhfluclhlnuccgmsxmyjrixyeo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872146.3366601-144-162725791370648/AnsiballZ_setup.py'
Oct 07 21:22:26 compute-0 sudo[42227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:22:26 compute-0 python3.9[42229]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 07 21:22:27 compute-0 sudo[42227]: pam_unix(sudo:session): session closed for user root
Oct 07 21:22:27 compute-0 sudo[42398]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imszmtvgmlnmheekqnruuysrcxcqgfar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872147.431998-166-28621700862649/AnsiballZ_file.py'
Oct 07 21:22:27 compute-0 sudo[42398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:22:28 compute-0 python3.9[42400]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:22:28 compute-0 sudo[42398]: pam_unix(sudo:session): session closed for user root
Oct 07 21:22:28 compute-0 sudo[42550]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkkxtarnzdwbhbapwmbvnjwngzaiiujj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872148.3525407-182-198715086162021/AnsiballZ_command.py'
Oct 07 21:22:28 compute-0 sudo[42550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:22:28 compute-0 python3.9[42552]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:22:28 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat523234552-merged.mount: Deactivated successfully.
Oct 07 21:22:28 compute-0 systemd[1]: var-lib-containers-storage-overlay-metacopy\x2dcheck1448841276-merged.mount: Deactivated successfully.
Oct 07 21:22:28 compute-0 podman[42553]: 2025-10-07 21:22:28.912332505 +0000 UTC m=+0.057269284 system refresh
Oct 07 21:22:28 compute-0 sudo[42550]: pam_unix(sudo:session): session closed for user root
Oct 07 21:22:29 compute-0 sudo[42712]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxvdilxdpohleywjynhmyskrspzzhkjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872149.2778993-198-102770054339581/AnsiballZ_stat.py'
Oct 07 21:22:29 compute-0 sudo[42712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:22:29 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 07 21:22:29 compute-0 python3.9[42714]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:22:29 compute-0 sudo[42712]: pam_unix(sudo:session): session closed for user root
Oct 07 21:22:30 compute-0 sudo[42835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-siwauuevxozuuwuodvamisqwdmrgnxyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872149.2778993-198-102770054339581/AnsiballZ_copy.py'
Oct 07 21:22:30 compute-0 sudo[42835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:22:30 compute-0 python3.9[42837]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759872149.2778993-198-102770054339581/.source.json follow=False _original_basename=podman_network_config.j2 checksum=00ef1b3955cffd9d73a27b920e4f78a6e6015b05 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:22:30 compute-0 sudo[42835]: pam_unix(sudo:session): session closed for user root
Oct 07 21:22:31 compute-0 sudo[42987]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bswakpxfbitykubcxejoqlncgevtwhah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872150.8100994-228-81507958546870/AnsiballZ_stat.py'
Oct 07 21:22:31 compute-0 sudo[42987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:22:31 compute-0 python3.9[42989]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:22:31 compute-0 sudo[42987]: pam_unix(sudo:session): session closed for user root
Oct 07 21:22:31 compute-0 sudo[43110]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-viyhjfgfvdhivuyugmhbbkxixmunagwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872150.8100994-228-81507958546870/AnsiballZ_copy.py'
Oct 07 21:22:31 compute-0 sudo[43110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:22:32 compute-0 python3.9[43112]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759872150.8100994-228-81507958546870/.source.conf follow=False _original_basename=registries.conf.j2 checksum=9a99cf1ed3988236ece9a89cc458342592ec4bd9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:22:32 compute-0 sudo[43110]: pam_unix(sudo:session): session closed for user root
Oct 07 21:22:32 compute-0 sudo[43262]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrusfdqentwmnesbhhndqtilqzaweglb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872152.2763584-260-247536890863284/AnsiballZ_ini_file.py'
Oct 07 21:22:32 compute-0 sudo[43262]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:22:32 compute-0 python3.9[43264]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:22:32 compute-0 sudo[43262]: pam_unix(sudo:session): session closed for user root
Oct 07 21:22:33 compute-0 sudo[43414]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blyjxradocasggijkdtcsqpoaqpkscic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872153.2019114-260-34324441115511/AnsiballZ_ini_file.py'
Oct 07 21:22:33 compute-0 sudo[43414]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:22:33 compute-0 python3.9[43416]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:22:33 compute-0 sudo[43414]: pam_unix(sudo:session): session closed for user root
Oct 07 21:22:34 compute-0 sudo[43566]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgvuxgilffqruyltqswibwodrvtfncor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872153.873376-260-235183154362692/AnsiballZ_ini_file.py'
Oct 07 21:22:34 compute-0 sudo[43566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:22:34 compute-0 python3.9[43568]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:22:34 compute-0 sudo[43566]: pam_unix(sudo:session): session closed for user root
Oct 07 21:22:34 compute-0 sudo[43718]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jiyeptpkrapaajcwowuaryoukzhmdikk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872154.561593-260-267071203629916/AnsiballZ_ini_file.py'
Oct 07 21:22:34 compute-0 sudo[43718]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:22:35 compute-0 python3.9[43720]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:22:35 compute-0 sudo[43718]: pam_unix(sudo:session): session closed for user root
Oct 07 21:22:36 compute-0 python3.9[43870]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 07 21:22:36 compute-0 sudo[44022]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzwmqczbvvxmadtqclmutdpaujwohfie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872156.2669768-340-248623836209961/AnsiballZ_dnf.py'
Oct 07 21:22:36 compute-0 sudo[44022]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:22:36 compute-0 python3.9[44024]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct 07 21:22:38 compute-0 sudo[44022]: pam_unix(sudo:session): session closed for user root
Oct 07 21:22:38 compute-0 sudo[44175]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjujcivlserkkssztqqotvruffnbxdbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872158.2649708-356-252748854840680/AnsiballZ_dnf.py'
Oct 07 21:22:38 compute-0 sudo[44175]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:22:38 compute-0 python3.9[44177]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct 07 21:22:40 compute-0 sudo[44175]: pam_unix(sudo:session): session closed for user root
Oct 07 21:22:41 compute-0 sudo[44335]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hndonizvtdzohxqfopavjqugkiifxbib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872160.729093-376-148181712362644/AnsiballZ_dnf.py'
Oct 07 21:22:41 compute-0 sudo[44335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:22:41 compute-0 python3.9[44337]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct 07 21:22:42 compute-0 sudo[44335]: pam_unix(sudo:session): session closed for user root
Oct 07 21:22:43 compute-0 sudo[44488]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbivpfmyycwwdgzdqhgrwzizmjzwuftf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872162.7646008-394-48517611814060/AnsiballZ_dnf.py'
Oct 07 21:22:43 compute-0 sudo[44488]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:22:43 compute-0 python3.9[44490]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct 07 21:22:44 compute-0 sudo[44488]: pam_unix(sudo:session): session closed for user root
Oct 07 21:22:45 compute-0 sudo[44641]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efltcyjzuebddiypzdyamratctnryjdb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872165.0958948-416-221812286248025/AnsiballZ_dnf.py'
Oct 07 21:22:45 compute-0 sudo[44641]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:22:45 compute-0 python3.9[44643]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct 07 21:22:46 compute-0 sudo[44641]: pam_unix(sudo:session): session closed for user root
Oct 07 21:22:47 compute-0 sudo[44797]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orqprfybscemzzismxwwgcrhunggrrnf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872167.29705-432-263033452237348/AnsiballZ_dnf.py'
Oct 07 21:22:47 compute-0 sudo[44797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:22:47 compute-0 python3.9[44799]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct 07 21:22:51 compute-0 sudo[44797]: pam_unix(sudo:session): session closed for user root
Oct 07 21:22:51 compute-0 sudo[44965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnxtfwwrzfaxcrlkxzynxxspwsrmnafa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872171.3034003-450-68595461406634/AnsiballZ_dnf.py'
Oct 07 21:22:51 compute-0 sudo[44965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:22:51 compute-0 python3.9[44967]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct 07 21:22:52 compute-0 sudo[44965]: pam_unix(sudo:session): session closed for user root
Oct 07 21:22:53 compute-0 sudo[45118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ayxmmnvqzsspbopefltaexegmwupcrvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872173.259132-468-182392129266933/AnsiballZ_dnf.py'
Oct 07 21:22:53 compute-0 sudo[45118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:22:53 compute-0 python3.9[45120]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct 07 21:23:02 compute-0 sshd-session[45132]: Invalid user bitwarden from 103.115.24.11 port 39360
Oct 07 21:23:02 compute-0 sshd-session[45132]: Received disconnect from 103.115.24.11 port 39360:11: Bye Bye [preauth]
Oct 07 21:23:02 compute-0 sshd-session[45132]: Disconnected from invalid user bitwarden 103.115.24.11 port 39360 [preauth]
Oct 07 21:23:07 compute-0 sudo[45118]: pam_unix(sudo:session): session closed for user root
Oct 07 21:23:07 compute-0 sudo[45457]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsnbialcryjjeautqlwhlnnmfqbdlkvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872187.3905365-490-248909298861343/AnsiballZ_file.py'
Oct 07 21:23:07 compute-0 sudo[45457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:23:07 compute-0 python3.9[45459]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:23:07 compute-0 sudo[45457]: pam_unix(sudo:session): session closed for user root
Oct 07 21:23:08 compute-0 sudo[45632]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owbwojkwllfyxyegwykbedygwpdutlmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872188.079147-506-70555474004309/AnsiballZ_stat.py'
Oct 07 21:23:08 compute-0 sudo[45632]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:23:08 compute-0 python3.9[45634]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:23:08 compute-0 sudo[45632]: pam_unix(sudo:session): session closed for user root
Oct 07 21:23:09 compute-0 sudo[45755]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gslorjrfakobfzsemevcjwxmgfmqtadf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872188.079147-506-70555474004309/AnsiballZ_copy.py'
Oct 07 21:23:09 compute-0 sudo[45755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:23:09 compute-0 python3.9[45757]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1759872188.079147-506-70555474004309/.source.json _original_basename=.590yxrz4 follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:23:09 compute-0 sudo[45755]: pam_unix(sudo:session): session closed for user root
Oct 07 21:23:09 compute-0 sudo[45907]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udgekgueydnnovilrdxcdcxkiemkyzag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872189.5500834-542-14806603979734/AnsiballZ_podman_image.py'
Oct 07 21:23:09 compute-0 sudo[45907]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:23:10 compute-0 python3.9[45909]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct 07 21:23:10 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 07 21:23:12 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat3806112760-lower\x2dmapped.mount: Deactivated successfully.
Oct 07 21:23:17 compute-0 podman[45922]: 2025-10-07 21:23:17.362616906 +0000 UTC m=+7.067676439 image pull 2c0acbe8b07baed3b27d0202cd594c4edfd15616d3c28ad8374e80ebca74a2a1 38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest
Oct 07 21:23:17 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 07 21:23:17 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 07 21:23:17 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 07 21:23:17 compute-0 sudo[45907]: pam_unix(sudo:session): session closed for user root
Oct 07 21:23:18 compute-0 sudo[46219]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmrkicmjuntcaphruhncnmbppdjcdtui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872197.9016736-560-154677305409731/AnsiballZ_podman_image.py'
Oct 07 21:23:18 compute-0 sudo[46219]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:23:18 compute-0 python3.9[46221]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct 07 21:23:18 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 07 21:23:19 compute-0 podman[46233]: 2025-10-07 21:23:19.896069813 +0000 UTC m=+1.431747640 image pull 3f0eba8665aff2d8053ef7db64bd77093affef7b4125d116bb9a11adf927b8d7 38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest
Oct 07 21:23:19 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 07 21:23:19 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 07 21:23:19 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 07 21:23:20 compute-0 sudo[46219]: pam_unix(sudo:session): session closed for user root
Oct 07 21:23:20 compute-0 sudo[46488]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jstlvumvlugzddhlmmtknbowtsgtkjem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872200.4197676-582-221038040717794/AnsiballZ_podman_image.py'
Oct 07 21:23:20 compute-0 sudo[46488]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:23:20 compute-0 python3.9[46490]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct 07 21:23:27 compute-0 podman[46503]: 2025-10-07 21:23:27.737712198 +0000 UTC m=+6.843632022 image pull 24d4277b41bbd1d97b6f360ea068040fe96182680512bacad34d1f578f4798a9 38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Oct 07 21:23:27 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 07 21:23:27 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 07 21:23:27 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 07 21:23:28 compute-0 sudo[46488]: pam_unix(sudo:session): session closed for user root
Oct 07 21:23:28 compute-0 sudo[46782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmrsroluwjdzcjakzjcyibmhxmrpbpre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872208.3769803-602-124436031541326/AnsiballZ_podman_image.py'
Oct 07 21:23:28 compute-0 sudo[46782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:23:28 compute-0 python3.9[46784]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct 07 21:23:28 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 07 21:23:29 compute-0 podman[46795]: 2025-10-07 21:23:29.289760317 +0000 UTC m=+0.352455264 image pull 4681127ca41b9c0ad73cf128c4c3175cc608608dca0d6e6910829324a5619ecd 38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest
Oct 07 21:23:29 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 07 21:23:29 compute-0 sudo[46782]: pam_unix(sudo:session): session closed for user root
Oct 07 21:23:30 compute-0 sudo[47028]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmrcxcehtzsfxvzikctgcjthqvwlfoxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872209.7683866-620-13003969835456/AnsiballZ_podman_image.py'
Oct 07 21:23:30 compute-0 sudo[47028]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:23:30 compute-0 python3.9[47030]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.12:5001/podified-master-centos10/openstack-nova-compute:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct 07 21:23:30 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 07 21:23:40 compute-0 podman[47042]: 2025-10-07 21:23:40.122697535 +0000 UTC m=+9.704107411 image pull e40a82bb5768ddbb81728291deee4da0629f3c0ac149f80011e3a69d48a891a3 38.102.83.12:5001/podified-master-centos10/openstack-nova-compute:watcher_latest
Oct 07 21:23:40 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 07 21:23:40 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 07 21:23:40 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 07 21:23:40 compute-0 sudo[47028]: pam_unix(sudo:session): session closed for user root
Oct 07 21:23:55 compute-0 sudo[47296]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emfgfvpfkccmloyhphrjvjztrnkusbrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872235.0720077-642-12720254734194/AnsiballZ_podman_image.py'
Oct 07 21:23:55 compute-0 sudo[47296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:23:55 compute-0 python3.9[47298]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.12:5001/podified-master-centos10/openstack-ceilometer-compute:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct 07 21:23:55 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 07 21:23:58 compute-0 podman[47310]: 2025-10-07 21:23:58.604649894 +0000 UTC m=+2.953092308 image pull cd906edb4943770d89c246447130754e53fa47860a119febfa4c20f4542a888b 38.102.83.12:5001/podified-master-centos10/openstack-ceilometer-compute:watcher_latest
Oct 07 21:23:58 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 07 21:23:58 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 07 21:23:58 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 07 21:23:58 compute-0 sudo[47296]: pam_unix(sudo:session): session closed for user root
Oct 07 21:23:59 compute-0 sudo[47562]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdbijwexvtndxvcliiemglojmxgaigmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872238.9454691-642-139937567281469/AnsiballZ_podman_image.py'
Oct 07 21:23:59 compute-0 sudo[47562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:23:59 compute-0 python3.9[47564]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct 07 21:24:00 compute-0 podman[47576]: 2025-10-07 21:24:00.688843942 +0000 UTC m=+1.153840066 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Oct 07 21:24:00 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 07 21:24:00 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 07 21:24:00 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 07 21:24:00 compute-0 sudo[47562]: pam_unix(sudo:session): session closed for user root
Oct 07 21:24:03 compute-0 sshd-session[41225]: Connection closed by 192.168.122.30 port 42300
Oct 07 21:24:03 compute-0 sshd-session[41222]: pam_unix(sshd:session): session closed for user zuul
Oct 07 21:24:03 compute-0 systemd[1]: session-11.scope: Deactivated successfully.
Oct 07 21:24:03 compute-0 systemd[1]: session-11.scope: Consumed 1min 48.732s CPU time.
Oct 07 21:24:03 compute-0 systemd-logind[798]: Session 11 logged out. Waiting for processes to exit.
Oct 07 21:24:03 compute-0 systemd-logind[798]: Removed session 11.
Oct 07 21:24:08 compute-0 sshd-session[47722]: Accepted publickey for zuul from 192.168.122.30 port 56414 ssh2: ECDSA SHA256:OH28aSFFDwSUyue/q6XYLqn3ZIRCwAhLEh6x3UJ/Ca0
Oct 07 21:24:08 compute-0 systemd-logind[798]: New session 12 of user zuul.
Oct 07 21:24:08 compute-0 systemd[1]: Started Session 12 of User zuul.
Oct 07 21:24:08 compute-0 sshd-session[47722]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 07 21:24:09 compute-0 python3.9[47875]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 07 21:24:10 compute-0 sudo[48029]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htsuiduaglbuirucfcgoxczshisxgfgu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872250.4580534-52-82206985271829/AnsiballZ_getent.py'
Oct 07 21:24:10 compute-0 sudo[48029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:24:11 compute-0 python3.9[48031]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Oct 07 21:24:11 compute-0 sudo[48029]: pam_unix(sudo:session): session closed for user root
Oct 07 21:24:11 compute-0 sudo[48182]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhzurzaryzifmhvabphihntqilnkzbru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872251.3934822-68-135140634800619/AnsiballZ_group.py'
Oct 07 21:24:11 compute-0 sudo[48182]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:24:12 compute-0 python3.9[48184]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct 07 21:24:12 compute-0 groupadd[48185]: group added to /etc/group: name=openvswitch, GID=42476
Oct 07 21:24:12 compute-0 groupadd[48185]: group added to /etc/gshadow: name=openvswitch
Oct 07 21:24:12 compute-0 groupadd[48185]: new group: name=openvswitch, GID=42476
Oct 07 21:24:12 compute-0 sudo[48182]: pam_unix(sudo:session): session closed for user root
Oct 07 21:24:12 compute-0 sudo[48340]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lljkozkwkgaojqyygpgvvsujbbxcejfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872252.3526356-84-159455128502056/AnsiballZ_user.py'
Oct 07 21:24:12 compute-0 sudo[48340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:24:13 compute-0 python3.9[48342]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct 07 21:24:13 compute-0 useradd[48344]: new user: name=openvswitch, UID=42476, GID=42476, home=/home/openvswitch, shell=/sbin/nologin, from=/dev/pts/0
Oct 07 21:24:13 compute-0 useradd[48344]: add 'openvswitch' to group 'hugetlbfs'
Oct 07 21:24:13 compute-0 useradd[48344]: add 'openvswitch' to shadow group 'hugetlbfs'
Oct 07 21:24:13 compute-0 sudo[48340]: pam_unix(sudo:session): session closed for user root
Oct 07 21:24:13 compute-0 sudo[48500]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqcoongqcsaumwvtctgaxntylrymqkog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872253.5165927-104-250663492471393/AnsiballZ_setup.py'
Oct 07 21:24:13 compute-0 sudo[48500]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:24:14 compute-0 python3.9[48502]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 07 21:24:14 compute-0 sudo[48500]: pam_unix(sudo:session): session closed for user root
Oct 07 21:24:14 compute-0 sudo[48584]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzwkugbecnyyjkecgluhiwfdxrdvsysr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872253.5165927-104-250663492471393/AnsiballZ_dnf.py'
Oct 07 21:24:14 compute-0 sudo[48584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:24:14 compute-0 python3.9[48586]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct 07 21:24:16 compute-0 sudo[48584]: pam_unix(sudo:session): session closed for user root
Oct 07 21:24:17 compute-0 sudo[48745]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygznvjbtsdhjrqgnsbiaxuokztxsdhki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872257.0656946-132-156232402537560/AnsiballZ_dnf.py'
Oct 07 21:24:17 compute-0 sudo[48745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:24:17 compute-0 python3.9[48747]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 07 21:24:28 compute-0 kernel: SELinux:  Converting 2726 SID table entries...
Oct 07 21:24:28 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Oct 07 21:24:28 compute-0 kernel: SELinux:  policy capability open_perms=1
Oct 07 21:24:28 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Oct 07 21:24:28 compute-0 kernel: SELinux:  policy capability always_check_network=0
Oct 07 21:24:28 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 07 21:24:28 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 07 21:24:28 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 07 21:24:29 compute-0 groupadd[48770]: group added to /etc/group: name=unbound, GID=993
Oct 07 21:24:29 compute-0 groupadd[48770]: group added to /etc/gshadow: name=unbound
Oct 07 21:24:29 compute-0 groupadd[48770]: new group: name=unbound, GID=993
Oct 07 21:24:29 compute-0 useradd[48777]: new user: name=unbound, UID=993, GID=993, home=/var/lib/unbound, shell=/sbin/nologin, from=none
Oct 07 21:24:29 compute-0 dbus-broker-launch[777]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Oct 07 21:24:29 compute-0 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Oct 07 21:24:30 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 07 21:24:30 compute-0 systemd[1]: Starting man-db-cache-update.service...
Oct 07 21:24:30 compute-0 systemd[1]: Reloading.
Oct 07 21:24:31 compute-0 systemd-rc-local-generator[49275]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:24:31 compute-0 systemd-sysv-generator[49278]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:24:31 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Oct 07 21:24:31 compute-0 sudo[48745]: pam_unix(sudo:session): session closed for user root
Oct 07 21:24:31 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 07 21:24:31 compute-0 systemd[1]: Finished man-db-cache-update.service.
Oct 07 21:24:31 compute-0 systemd[1]: run-r7ba719c496fb4fce88007dde2cbb670e.service: Deactivated successfully.
Oct 07 21:24:32 compute-0 sudo[49849]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdwtybfvhhsjynznwnopdimvotlsargt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872271.9239204-148-261326868050548/AnsiballZ_systemd.py'
Oct 07 21:24:32 compute-0 sudo[49849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:24:32 compute-0 python3.9[49851]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 07 21:24:33 compute-0 systemd[1]: Reloading.
Oct 07 21:24:33 compute-0 systemd-sysv-generator[49882]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:24:33 compute-0 systemd-rc-local-generator[49877]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:24:34 compute-0 systemd[1]: Starting Open vSwitch Database Unit...
Oct 07 21:24:34 compute-0 chown[49893]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Oct 07 21:24:34 compute-0 ovs-ctl[49898]: /etc/openvswitch/conf.db does not exist ... (warning).
Oct 07 21:24:34 compute-0 ovs-ctl[49898]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Oct 07 21:24:34 compute-0 ovs-ctl[49898]: Starting ovsdb-server [  OK  ]
Oct 07 21:24:34 compute-0 ovs-vsctl[49947]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Oct 07 21:24:34 compute-0 ovs-vsctl[49963]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"dca786dc-b408-4181-8e47-0e14c60f13da\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Oct 07 21:24:34 compute-0 ovs-ctl[49898]: Configuring Open vSwitch system IDs [  OK  ]
Oct 07 21:24:34 compute-0 ovs-ctl[49898]: Enabling remote OVSDB managers [  OK  ]
Oct 07 21:24:34 compute-0 ovs-vsctl[49973]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Oct 07 21:24:34 compute-0 systemd[1]: Started Open vSwitch Database Unit.
Oct 07 21:24:34 compute-0 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Oct 07 21:24:34 compute-0 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Oct 07 21:24:34 compute-0 systemd[1]: Starting Open vSwitch Forwarding Unit...
Oct 07 21:24:34 compute-0 kernel: openvswitch: Open vSwitch switching datapath
Oct 07 21:24:34 compute-0 ovs-ctl[50018]: Inserting openvswitch module [  OK  ]
Oct 07 21:24:34 compute-0 ovs-ctl[49987]: Starting ovs-vswitchd [  OK  ]
Oct 07 21:24:34 compute-0 ovs-ctl[49987]: Enabling remote OVSDB managers [  OK  ]
Oct 07 21:24:34 compute-0 ovs-vsctl[50035]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Oct 07 21:24:34 compute-0 systemd[1]: Started Open vSwitch Forwarding Unit.
Oct 07 21:24:34 compute-0 systemd[1]: Starting Open vSwitch...
Oct 07 21:24:34 compute-0 systemd[1]: Finished Open vSwitch.
Oct 07 21:24:34 compute-0 sudo[49849]: pam_unix(sudo:session): session closed for user root
Oct 07 21:24:35 compute-0 python3.9[50187]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 07 21:24:36 compute-0 sudo[50337]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ksabsxzuaxfdjmlpzjwtwcgzidmouvln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872276.078363-184-74513168174174/AnsiballZ_sefcontext.py'
Oct 07 21:24:36 compute-0 sudo[50337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:24:36 compute-0 python3.9[50339]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Oct 07 21:24:37 compute-0 kernel: SELinux:  Converting 2740 SID table entries...
Oct 07 21:24:37 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Oct 07 21:24:37 compute-0 kernel: SELinux:  policy capability open_perms=1
Oct 07 21:24:37 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Oct 07 21:24:37 compute-0 kernel: SELinux:  policy capability always_check_network=0
Oct 07 21:24:37 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 07 21:24:37 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 07 21:24:37 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 07 21:24:38 compute-0 sudo[50337]: pam_unix(sudo:session): session closed for user root
Oct 07 21:24:38 compute-0 python3.9[50494]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 07 21:24:39 compute-0 sudo[50650]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezdgqlytltrrxtirblmfcdtdkphedpin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872279.342341-220-175530472911245/AnsiballZ_dnf.py'
Oct 07 21:24:39 compute-0 dbus-broker-launch[777]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Oct 07 21:24:39 compute-0 sudo[50650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:24:39 compute-0 python3.9[50652]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 07 21:24:40 compute-0 sudo[50650]: pam_unix(sudo:session): session closed for user root
Oct 07 21:24:41 compute-0 sudo[50803]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdbtvikwlokarvjiiqjnvobfnxxdprwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872281.1411097-236-162707174444386/AnsiballZ_command.py'
Oct 07 21:24:41 compute-0 sudo[50803]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:24:41 compute-0 python3.9[50805]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:24:42 compute-0 sudo[50803]: pam_unix(sudo:session): session closed for user root
Oct 07 21:24:43 compute-0 sudo[51090]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzmiruodowqdijphwjbhvipdgouvyrhg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872282.6815655-252-156136511473384/AnsiballZ_file.py'
Oct 07 21:24:43 compute-0 sudo[51090]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:24:43 compute-0 python3.9[51092]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct 07 21:24:43 compute-0 sudo[51090]: pam_unix(sudo:session): session closed for user root
Oct 07 21:24:44 compute-0 python3.9[51242]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 07 21:24:44 compute-0 sudo[51394]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgpisvbjmxgfbujhyxmsbundcdhdlmuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872284.412594-284-18308054018046/AnsiballZ_dnf.py'
Oct 07 21:24:44 compute-0 sudo[51394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:24:45 compute-0 python3.9[51396]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 07 21:24:48 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 07 21:24:48 compute-0 systemd[1]: Starting man-db-cache-update.service...
Oct 07 21:24:48 compute-0 systemd[1]: Reloading.
Oct 07 21:24:48 compute-0 systemd-sysv-generator[51436]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:24:48 compute-0 systemd-rc-local-generator[51430]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:24:48 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Oct 07 21:24:48 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 07 21:24:48 compute-0 systemd[1]: Finished man-db-cache-update.service.
Oct 07 21:24:48 compute-0 systemd[1]: run-rf094241a9497491b9e6af9f596671061.service: Deactivated successfully.
Oct 07 21:24:48 compute-0 sudo[51394]: pam_unix(sudo:session): session closed for user root
Oct 07 21:24:49 compute-0 sudo[51711]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cadztenxjpojmekbvraynoxdlwvneqvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872288.8691545-300-170578326618757/AnsiballZ_systemd.py'
Oct 07 21:24:49 compute-0 sudo[51711]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:24:49 compute-0 python3.9[51713]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 07 21:24:49 compute-0 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Oct 07 21:24:49 compute-0 systemd[1]: Stopped Network Manager Wait Online.
Oct 07 21:24:49 compute-0 systemd[1]: Stopping Network Manager Wait Online...
Oct 07 21:24:49 compute-0 systemd[1]: Stopping Network Manager...
Oct 07 21:24:49 compute-0 NetworkManager[3945]: <info>  [1759872289.5956] caught SIGTERM, shutting down normally.
Oct 07 21:24:49 compute-0 NetworkManager[3945]: <info>  [1759872289.5983] dhcp4 (eth0): canceled DHCP transaction
Oct 07 21:24:49 compute-0 NetworkManager[3945]: <info>  [1759872289.5984] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct 07 21:24:49 compute-0 NetworkManager[3945]: <info>  [1759872289.5984] dhcp4 (eth0): state changed no lease
Oct 07 21:24:49 compute-0 NetworkManager[3945]: <info>  [1759872289.5987] manager: NetworkManager state is now CONNECTED_SITE
Oct 07 21:24:49 compute-0 NetworkManager[3945]: <info>  [1759872289.6082] exiting (success)
Oct 07 21:24:49 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct 07 21:24:49 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct 07 21:24:49 compute-0 systemd[1]: NetworkManager.service: Deactivated successfully.
Oct 07 21:24:49 compute-0 systemd[1]: Stopped Network Manager.
Oct 07 21:24:49 compute-0 systemd[1]: NetworkManager.service: Consumed 10.395s CPU time, 4.1M memory peak, read 0B from disk, written 35.5K to disk.
Oct 07 21:24:49 compute-0 systemd[1]: Starting Network Manager...
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.7117] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:7431bc2e-d322-496b-a062-a84e4f20d15f)
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.7119] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.7200] manager[0x556be5eee090]: monitoring kernel firmware directory '/lib/firmware'.
Oct 07 21:24:49 compute-0 systemd[1]: Starting Hostname Service...
Oct 07 21:24:49 compute-0 systemd[1]: Started Hostname Service.
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8318] hostname: hostname: using hostnamed
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8318] hostname: static hostname changed from (none) to "compute-0"
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8322] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8327] manager[0x556be5eee090]: rfkill: Wi-Fi hardware radio set enabled
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8327] manager[0x556be5eee090]: rfkill: WWAN hardware radio set enabled
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8347] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8354] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8355] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8355] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8356] manager: Networking is enabled by state file
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8358] settings: Loaded settings plugin: keyfile (internal)
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8361] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8389] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8401] dhcp: init: Using DHCP client 'internal'
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8403] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8409] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8414] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8424] device (lo): Activation: starting connection 'lo' (1bf17f26-0348-44a6-aa6b-b5a51eaf7edf)
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8432] device (eth0): carrier: link connected
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8436] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8441] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8442] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8449] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8457] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8464] device (eth1): carrier: link connected
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8467] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8472] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (6ff53518-8ab0-58fa-aaf9-1d4f04c1efd5) (indicated)
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8473] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8478] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8486] device (eth1): Activation: starting connection 'ci-private-network' (6ff53518-8ab0-58fa-aaf9-1d4f04c1efd5)
Oct 07 21:24:49 compute-0 systemd[1]: Started Network Manager.
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8512] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8532] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8538] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8542] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8547] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8553] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8560] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8566] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8573] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8586] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8592] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct 07 21:24:49 compute-0 systemd[1]: Starting Network Manager Wait Online...
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8630] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8648] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8657] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8660] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8666] device (lo): Activation: successful, device activated.
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8674] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8676] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8679] manager: NetworkManager state is now CONNECTED_LOCAL
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8682] device (eth1): Activation: successful, device activated.
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8691] dhcp4 (eth0): state changed new lease, address=38.102.83.103
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8700] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8770] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8824] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8827] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8832] manager: NetworkManager state is now CONNECTED_SITE
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8838] device (eth0): Activation: successful, device activated.
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8846] manager: NetworkManager state is now CONNECTED_GLOBAL
Oct 07 21:24:49 compute-0 NetworkManager[51722]: <info>  [1759872289.8850] manager: startup complete
Oct 07 21:24:49 compute-0 sudo[51711]: pam_unix(sudo:session): session closed for user root
Oct 07 21:24:49 compute-0 systemd[1]: Finished Network Manager Wait Online.
Oct 07 21:24:50 compute-0 sudo[51937]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vcrtogrkbohpjninzjzbyeikemdrfiza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872290.0849757-316-70811860677374/AnsiballZ_dnf.py'
Oct 07 21:24:50 compute-0 sudo[51937]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:24:50 compute-0 python3.9[51939]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 07 21:24:56 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 07 21:24:56 compute-0 systemd[1]: Starting man-db-cache-update.service...
Oct 07 21:24:56 compute-0 systemd[1]: Reloading.
Oct 07 21:24:56 compute-0 systemd-rc-local-generator[51989]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:24:56 compute-0 systemd-sysv-generator[51994]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:24:56 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Oct 07 21:24:57 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 07 21:24:57 compute-0 systemd[1]: Finished man-db-cache-update.service.
Oct 07 21:24:57 compute-0 systemd[1]: run-rab4ee8fcccf54a6084534298774eb898.service: Deactivated successfully.
Oct 07 21:24:57 compute-0 sudo[51937]: pam_unix(sudo:session): session closed for user root
Oct 07 21:24:58 compute-0 sudo[52399]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rfwdhslinwvcngzaqycchbwxatpaiekq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872297.8681247-340-79357725659123/AnsiballZ_stat.py'
Oct 07 21:24:58 compute-0 sudo[52399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:24:58 compute-0 python3.9[52401]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 07 21:24:58 compute-0 sudo[52399]: pam_unix(sudo:session): session closed for user root
Oct 07 21:24:59 compute-0 sudo[52551]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sepvxeclvjoofhzoqauodocsnswkthjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872298.7022758-358-92118098794794/AnsiballZ_ini_file.py'
Oct 07 21:24:59 compute-0 sudo[52551]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:24:59 compute-0 python3.9[52553]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:24:59 compute-0 sudo[52551]: pam_unix(sudo:session): session closed for user root
Oct 07 21:25:00 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct 07 21:25:00 compute-0 sudo[52705]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qetoskkxgqkjfqfnbdxsrucitmfthdgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872299.8769023-378-92644249573952/AnsiballZ_ini_file.py'
Oct 07 21:25:00 compute-0 sudo[52705]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:25:00 compute-0 python3.9[52707]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:25:00 compute-0 sudo[52705]: pam_unix(sudo:session): session closed for user root
Oct 07 21:25:00 compute-0 sudo[52857]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hahilkyeubwpwrxrmleicvxwaavqamul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872300.6402829-378-233498876161240/AnsiballZ_ini_file.py'
Oct 07 21:25:00 compute-0 sudo[52857]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:25:01 compute-0 python3.9[52859]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:25:01 compute-0 sudo[52857]: pam_unix(sudo:session): session closed for user root
Oct 07 21:25:01 compute-0 sudo[53009]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzsnvxyiqfdywaooacwsflzroiwleuhw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872301.3496869-408-15496088915684/AnsiballZ_ini_file.py'
Oct 07 21:25:01 compute-0 sudo[53009]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:25:01 compute-0 python3.9[53011]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:25:01 compute-0 sudo[53009]: pam_unix(sudo:session): session closed for user root
Oct 07 21:25:02 compute-0 sudo[53161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhhfnaozrbooraoibmnszdlexagakhtw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872302.020171-408-204726257258510/AnsiballZ_ini_file.py'
Oct 07 21:25:02 compute-0 sudo[53161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:25:02 compute-0 python3.9[53163]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:25:02 compute-0 sudo[53161]: pam_unix(sudo:session): session closed for user root
Oct 07 21:25:03 compute-0 sudo[53313]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swdivautldjuicghhsjyqdnzjtmszdjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872302.770882-438-96617582114720/AnsiballZ_stat.py'
Oct 07 21:25:03 compute-0 sudo[53313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:25:03 compute-0 python3.9[53315]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:25:03 compute-0 sudo[53313]: pam_unix(sudo:session): session closed for user root
Oct 07 21:25:03 compute-0 sudo[53436]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztojtmbvyqjderkrrwltcerxzczfisng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872302.770882-438-96617582114720/AnsiballZ_copy.py'
Oct 07 21:25:03 compute-0 sudo[53436]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:25:03 compute-0 python3.9[53438]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1759872302.770882-438-96617582114720/.source _original_basename=.1gn92w27 follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:25:03 compute-0 sudo[53436]: pam_unix(sudo:session): session closed for user root
Oct 07 21:25:04 compute-0 sudo[53588]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjxyozdzgdgwlxtyqxbhppmwsfprrqla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872304.1265426-468-191092951095988/AnsiballZ_file.py'
Oct 07 21:25:04 compute-0 sudo[53588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:25:04 compute-0 python3.9[53590]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:25:04 compute-0 sudo[53588]: pam_unix(sudo:session): session closed for user root
Oct 07 21:25:05 compute-0 sudo[53740]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wurgcebfidzimjenqyfbpcfsdlofyzmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872304.8965228-484-183175494861317/AnsiballZ_edpm_os_net_config_mappings.py'
Oct 07 21:25:05 compute-0 sudo[53740]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:25:05 compute-0 python3.9[53742]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Oct 07 21:25:05 compute-0 sudo[53740]: pam_unix(sudo:session): session closed for user root
Oct 07 21:25:06 compute-0 sudo[53892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ranlhvtzukocgwnwltdyjwyzunkolhhw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872305.8879614-502-175263088157127/AnsiballZ_file.py'
Oct 07 21:25:06 compute-0 sudo[53892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:25:06 compute-0 python3.9[53894]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:25:06 compute-0 sudo[53892]: pam_unix(sudo:session): session closed for user root
Oct 07 21:25:07 compute-0 sudo[54044]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eiqxskudjrjcchoaviigcshppknbcmxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872306.7957246-522-43628723179687/AnsiballZ_stat.py'
Oct 07 21:25:07 compute-0 sudo[54044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:25:07 compute-0 sudo[54044]: pam_unix(sudo:session): session closed for user root
Oct 07 21:25:07 compute-0 sudo[54167]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwcfzjwltaxfzjkjbfvxnvmrtvkqkskp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872306.7957246-522-43628723179687/AnsiballZ_copy.py'
Oct 07 21:25:07 compute-0 sudo[54167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:25:07 compute-0 sudo[54167]: pam_unix(sudo:session): session closed for user root
Oct 07 21:25:08 compute-0 sudo[54319]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbogwpjjyxcgmckffhzzbphhacdxmfur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872308.1795704-552-164749255787557/AnsiballZ_slurp.py'
Oct 07 21:25:08 compute-0 sudo[54319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:25:08 compute-0 python3.9[54321]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Oct 07 21:25:08 compute-0 sudo[54319]: pam_unix(sudo:session): session closed for user root
Oct 07 21:25:10 compute-0 sudo[54494]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rufzbaphbotadsiamkfljlixorkieinp ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872309.2211347-570-247954795427965/async_wrapper.py j814274977975 300 /home/zuul/.ansible/tmp/ansible-tmp-1759872309.2211347-570-247954795427965/AnsiballZ_edpm_os_net_config.py _'
Oct 07 21:25:10 compute-0 sudo[54494]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:25:10 compute-0 ansible-async_wrapper.py[54496]: Invoked with j814274977975 300 /home/zuul/.ansible/tmp/ansible-tmp-1759872309.2211347-570-247954795427965/AnsiballZ_edpm_os_net_config.py _
Oct 07 21:25:10 compute-0 ansible-async_wrapper.py[54499]: Starting module and watcher
Oct 07 21:25:10 compute-0 ansible-async_wrapper.py[54499]: Start watching 54500 (300)
Oct 07 21:25:10 compute-0 ansible-async_wrapper.py[54500]: Start module (54500)
Oct 07 21:25:10 compute-0 ansible-async_wrapper.py[54496]: Return async_wrapper task started.
Oct 07 21:25:10 compute-0 sudo[54494]: pam_unix(sudo:session): session closed for user root
Oct 07 21:25:10 compute-0 python3.9[54501]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Oct 07 21:25:11 compute-0 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Oct 07 21:25:11 compute-0 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Oct 07 21:25:11 compute-0 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Oct 07 21:25:11 compute-0 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Oct 07 21:25:11 compute-0 kernel: cfg80211: failed to load regulatory.db
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.0161] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=54502 uid=0 result="success"
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.0181] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=54502 uid=0 result="success"
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.0810] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.0813] audit: op="connection-add" uuid="3415e97a-b6a3-479f-bc7c-957211543049" name="br-ex-br" pid=54502 uid=0 result="success"
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.0839] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.0842] audit: op="connection-add" uuid="ac5e26f9-e6fd-4cd6-9c3e-80bc3129b220" name="br-ex-port" pid=54502 uid=0 result="success"
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.0865] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.0867] audit: op="connection-add" uuid="9337e1b3-acd9-471b-b2da-6f3736642253" name="eth1-port" pid=54502 uid=0 result="success"
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.0891] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.0894] audit: op="connection-add" uuid="5cd30ea6-eade-4368-bd28-8555ed006282" name="vlan20-port" pid=54502 uid=0 result="success"
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.0916] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.0920] audit: op="connection-add" uuid="90e7806e-4a28-410e-86ae-d0a3e97e423e" name="vlan21-port" pid=54502 uid=0 result="success"
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.0942] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.0945] audit: op="connection-add" uuid="84301ebd-3ca1-4219-abc7-4edb776a5539" name="vlan22-port" pid=54502 uid=0 result="success"
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.0982] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="connection.timestamp,connection.autoconnect-priority,802-3-ethernet.mtu,ipv4.dhcp-client-id,ipv4.dhcp-timeout,ipv6.dhcp-timeout,ipv6.method,ipv6.addr-gen-mode" pid=54502 uid=0 result="success"
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1012] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1015] audit: op="connection-add" uuid="b0c55d9e-ae4e-44a0-af95-d566c5556e08" name="br-ex-if" pid=54502 uid=0 result="success"
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1069] audit: op="connection-update" uuid="6ff53518-8ab0-58fa-aaf9-1d4f04c1efd5" name="ci-private-network" args="connection.slave-type,connection.port-type,connection.controller,connection.timestamp,connection.master,ovs-interface.type,ipv4.method,ipv4.addresses,ipv4.routes,ipv4.routing-rules,ipv4.dns,ipv4.never-default,ipv6.routing-rules,ipv6.method,ipv6.addresses,ipv6.addr-gen-mode,ipv6.routes,ipv6.dns,ovs-external-ids.data" pid=54502 uid=0 result="success"
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1096] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1099] audit: op="connection-add" uuid="030815e1-7a17-4c3c-be7c-8112f18eae70" name="vlan20-if" pid=54502 uid=0 result="success"
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1126] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1128] audit: op="connection-add" uuid="143a08c5-364c-41ea-be69-f7b9fd5495d1" name="vlan21-if" pid=54502 uid=0 result="success"
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1152] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1155] audit: op="connection-add" uuid="86a820da-f465-4df9-8832-88f71fee1356" name="vlan22-if" pid=54502 uid=0 result="success"
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1171] audit: op="connection-delete" uuid="dba2d05c-583f-3e37-8712-a9017933dd6a" name="Wired connection 1" pid=54502 uid=0 result="success"
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1188] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1204] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1210] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (3415e97a-b6a3-479f-bc7c-957211543049)
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1211] audit: op="connection-activate" uuid="3415e97a-b6a3-479f-bc7c-957211543049" name="br-ex-br" pid=54502 uid=0 result="success"
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1213] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1222] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1229] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (ac5e26f9-e6fd-4cd6-9c3e-80bc3129b220)
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1232] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1241] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1247] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (9337e1b3-acd9-471b-b2da-6f3736642253)
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1250] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1260] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1267] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (5cd30ea6-eade-4368-bd28-8555ed006282)
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1270] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1280] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1286] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (90e7806e-4a28-410e-86ae-d0a3e97e423e)
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1291] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1302] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1309] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (84301ebd-3ca1-4219-abc7-4edb776a5539)
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1310] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1314] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1317] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1326] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1331] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1336] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (b0c55d9e-ae4e-44a0-af95-d566c5556e08)
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1336] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1340] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1342] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1343] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1344] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1358] device (eth1): disconnecting for new activation request.
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1359] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1362] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1364] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1365] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1367] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1373] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1376] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (030815e1-7a17-4c3c-be7c-8112f18eae70)
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1377] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1380] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1381] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1383] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1386] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1391] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1396] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (143a08c5-364c-41ea-be69-f7b9fd5495d1)
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1397] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1400] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1402] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1403] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1406] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1411] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1414] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (86a820da-f465-4df9-8832-88f71fee1356)
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1415] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1418] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1419] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1421] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1423] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1438] audit: op="device-reapply" interface="eth0" ifindex=2 args="connection.autoconnect-priority,802-3-ethernet.mtu,ipv4.dhcp-client-id,ipv4.dhcp-timeout,ipv6.method,ipv6.addr-gen-mode" pid=54502 uid=0 result="success"
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1440] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1444] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1446] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1453] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1456] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1460] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1463] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1465] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 kernel: ovs-system: entered promiscuous mode
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1472] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1478] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 kernel: Timeout policy base is empty
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1483] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1485] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1492] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1498] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1502] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1504] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1510] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1515] dhcp4 (eth0): canceled DHCP transaction
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1515] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1515] dhcp4 (eth0): state changed no lease
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1517] dhcp4 (eth0): activation: beginning transaction (no timeout)
Oct 07 21:25:12 compute-0 systemd-udevd[54508]: Network interface NamePolicy= disabled on kernel command line.
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1531] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1537] audit: op="device-reapply" interface="eth1" ifindex=3 pid=54502 uid=0 result="fail" reason="Device is not activated"
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1544] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Oct 07 21:25:12 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1588] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1592] dhcp4 (eth0): state changed new lease, address=38.102.83.103
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1646] device (eth1): disconnecting for new activation request.
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1647] audit: op="connection-activate" uuid="6ff53518-8ab0-58fa-aaf9-1d4f04c1efd5" name="ci-private-network" pid=54502 uid=0 result="success"
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1649] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Oct 07 21:25:12 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1711] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=54502 uid=0 result="success"
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1712] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Oct 07 21:25:12 compute-0 kernel: br-ex: entered promiscuous mode
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1889] device (eth1): Activation: starting connection 'ci-private-network' (6ff53518-8ab0-58fa-aaf9-1d4f04c1efd5)
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1894] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1901] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1904] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1910] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1916] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1925] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1927] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1929] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1931] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1932] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1935] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 kernel: vlan22: entered promiscuous mode
Oct 07 21:25:12 compute-0 systemd-udevd[54506]: Network interface NamePolicy= disabled on kernel command line.
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1955] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1968] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1973] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1977] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1981] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1986] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1992] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.1997] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.2003] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.2007] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.2015] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Oct 07 21:25:12 compute-0 kernel: vlan20: entered promiscuous mode
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.2032] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.2048] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.2067] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.2073] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.2079] device (eth1): Activation: successful, device activated.
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.2098] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 kernel: vlan21: entered promiscuous mode
Oct 07 21:25:12 compute-0 systemd-udevd[54507]: Network interface NamePolicy= disabled on kernel command line.
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.2127] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.2138] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.2145] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.2152] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Oct 07 21:25:12 compute-0 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.2182] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.2186] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.2211] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.2217] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.2230] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.2237] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.2252] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.2253] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.2258] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.2264] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.2278] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.2310] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.2311] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 07 21:25:12 compute-0 NetworkManager[51722]: <info>  [1759872312.2317] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Oct 07 21:25:13 compute-0 NetworkManager[51722]: <info>  [1759872313.3504] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=54502 uid=0 result="success"
Oct 07 21:25:13 compute-0 NetworkManager[51722]: <info>  [1759872313.5136] checkpoint[0x556be5ec4950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Oct 07 21:25:13 compute-0 NetworkManager[51722]: <info>  [1759872313.5138] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=54502 uid=0 result="success"
Oct 07 21:25:13 compute-0 NetworkManager[51722]: <info>  [1759872313.8238] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=54502 uid=0 result="success"
Oct 07 21:25:13 compute-0 NetworkManager[51722]: <info>  [1759872313.8249] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=54502 uid=0 result="success"
Oct 07 21:25:13 compute-0 sudo[54835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oanqbrjnofgsefqlyhofhjllfrqsyphj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872313.4215267-570-172238127383260/AnsiballZ_async_status.py'
Oct 07 21:25:13 compute-0 sudo[54835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:25:14 compute-0 NetworkManager[51722]: <info>  [1759872314.0121] audit: op="networking-control" arg="global-dns-configuration" pid=54502 uid=0 result="success"
Oct 07 21:25:14 compute-0 NetworkManager[51722]: <info>  [1759872314.0149] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Oct 07 21:25:14 compute-0 NetworkManager[51722]: <info>  [1759872314.0181] audit: op="networking-control" arg="global-dns-configuration" pid=54502 uid=0 result="success"
Oct 07 21:25:14 compute-0 NetworkManager[51722]: <info>  [1759872314.0197] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=54502 uid=0 result="success"
Oct 07 21:25:14 compute-0 python3.9[54837]: ansible-ansible.legacy.async_status Invoked with jid=j814274977975.54496 mode=status _async_dir=/root/.ansible_async
Oct 07 21:25:14 compute-0 sudo[54835]: pam_unix(sudo:session): session closed for user root
Oct 07 21:25:14 compute-0 NetworkManager[51722]: <info>  [1759872314.1460] checkpoint[0x556be5ec4a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Oct 07 21:25:14 compute-0 NetworkManager[51722]: <info>  [1759872314.1464] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=54502 uid=0 result="success"
Oct 07 21:25:14 compute-0 ansible-async_wrapper.py[54500]: Module complete (54500)
Oct 07 21:25:15 compute-0 ansible-async_wrapper.py[54499]: Done in kid B.
Oct 07 21:25:17 compute-0 sudo[54939]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdwjtlkapsfjfnrhxpuytbgmbkyqcpjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872313.4215267-570-172238127383260/AnsiballZ_async_status.py'
Oct 07 21:25:17 compute-0 sudo[54939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:25:17 compute-0 python3.9[54941]: ansible-ansible.legacy.async_status Invoked with jid=j814274977975.54496 mode=status _async_dir=/root/.ansible_async
Oct 07 21:25:17 compute-0 sudo[54939]: pam_unix(sudo:session): session closed for user root
Oct 07 21:25:17 compute-0 sudo[55039]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yraynmiiciwbccxsbunslfkpnrrasgzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872313.4215267-570-172238127383260/AnsiballZ_async_status.py'
Oct 07 21:25:17 compute-0 sudo[55039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:25:18 compute-0 python3.9[55041]: ansible-ansible.legacy.async_status Invoked with jid=j814274977975.54496 mode=cleanup _async_dir=/root/.ansible_async
Oct 07 21:25:18 compute-0 sudo[55039]: pam_unix(sudo:session): session closed for user root
Oct 07 21:25:18 compute-0 sudo[55191]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czgioktveqjpvzxfnhmxlgvqodxqxwmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872318.303696-624-110653291579691/AnsiballZ_stat.py'
Oct 07 21:25:18 compute-0 sudo[55191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:25:18 compute-0 python3.9[55193]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:25:18 compute-0 sudo[55191]: pam_unix(sudo:session): session closed for user root
Oct 07 21:25:19 compute-0 sudo[55314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kerkcyfbrpkvtxidspoaqxplexylsnft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872318.303696-624-110653291579691/AnsiballZ_copy.py'
Oct 07 21:25:19 compute-0 sudo[55314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:25:19 compute-0 python3.9[55316]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759872318.303696-624-110653291579691/.source.returncode _original_basename=.ln__8c6f follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:25:19 compute-0 sudo[55314]: pam_unix(sudo:session): session closed for user root
Oct 07 21:25:19 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct 07 21:25:20 compute-0 sudo[55469]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbhltzctlmtujbfipucqffvsfdszdtdj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872319.8529835-656-265292919016371/AnsiballZ_stat.py'
Oct 07 21:25:20 compute-0 sudo[55469]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:25:20 compute-0 python3.9[55471]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:25:20 compute-0 sudo[55469]: pam_unix(sudo:session): session closed for user root
Oct 07 21:25:20 compute-0 sudo[55593]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lezfgyxjfyeitqygqbhoaoxqnlvjzumn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872319.8529835-656-265292919016371/AnsiballZ_copy.py'
Oct 07 21:25:20 compute-0 sudo[55593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:25:21 compute-0 python3.9[55595]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759872319.8529835-656-265292919016371/.source.cfg _original_basename=.szqh11yo follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:25:21 compute-0 sudo[55593]: pam_unix(sudo:session): session closed for user root
Oct 07 21:25:21 compute-0 sudo[55745]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cciniqnrattntldoofpyzeogtnfzioyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872321.3367393-686-195938026782073/AnsiballZ_systemd.py'
Oct 07 21:25:21 compute-0 sudo[55745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:25:22 compute-0 python3.9[55747]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 07 21:25:22 compute-0 systemd[1]: Reloading Network Manager...
Oct 07 21:25:22 compute-0 NetworkManager[51722]: <info>  [1759872322.1421] audit: op="reload" arg="0" pid=55751 uid=0 result="success"
Oct 07 21:25:22 compute-0 NetworkManager[51722]: <info>  [1759872322.1428] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Oct 07 21:25:22 compute-0 systemd[1]: Reloaded Network Manager.
Oct 07 21:25:22 compute-0 sudo[55745]: pam_unix(sudo:session): session closed for user root
Oct 07 21:25:22 compute-0 sshd-session[47725]: Connection closed by 192.168.122.30 port 56414
Oct 07 21:25:22 compute-0 sshd-session[47722]: pam_unix(sshd:session): session closed for user zuul
Oct 07 21:25:22 compute-0 systemd[1]: session-12.scope: Deactivated successfully.
Oct 07 21:25:22 compute-0 systemd[1]: session-12.scope: Consumed 51.141s CPU time.
Oct 07 21:25:22 compute-0 systemd-logind[798]: Session 12 logged out. Waiting for processes to exit.
Oct 07 21:25:22 compute-0 systemd-logind[798]: Removed session 12.
Oct 07 21:25:28 compute-0 sshd-session[55782]: Accepted publickey for zuul from 192.168.122.30 port 59156 ssh2: ECDSA SHA256:OH28aSFFDwSUyue/q6XYLqn3ZIRCwAhLEh6x3UJ/Ca0
Oct 07 21:25:28 compute-0 systemd-logind[798]: New session 13 of user zuul.
Oct 07 21:25:28 compute-0 systemd[1]: Started Session 13 of User zuul.
Oct 07 21:25:28 compute-0 sshd-session[55782]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 07 21:25:29 compute-0 python3.9[55935]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 07 21:25:30 compute-0 python3.9[56090]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 07 21:25:31 compute-0 python3.9[56279]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:25:32 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct 07 21:25:32 compute-0 sshd-session[55785]: Connection closed by 192.168.122.30 port 59156
Oct 07 21:25:32 compute-0 sshd-session[55782]: pam_unix(sshd:session): session closed for user zuul
Oct 07 21:25:32 compute-0 systemd[1]: session-13.scope: Deactivated successfully.
Oct 07 21:25:32 compute-0 systemd[1]: session-13.scope: Consumed 2.320s CPU time.
Oct 07 21:25:32 compute-0 systemd-logind[798]: Session 13 logged out. Waiting for processes to exit.
Oct 07 21:25:32 compute-0 systemd-logind[798]: Removed session 13.
Oct 07 21:25:37 compute-0 sshd-session[56308]: Accepted publickey for zuul from 192.168.122.30 port 43778 ssh2: ECDSA SHA256:OH28aSFFDwSUyue/q6XYLqn3ZIRCwAhLEh6x3UJ/Ca0
Oct 07 21:25:37 compute-0 systemd-logind[798]: New session 14 of user zuul.
Oct 07 21:25:37 compute-0 systemd[1]: Started Session 14 of User zuul.
Oct 07 21:25:37 compute-0 sshd-session[56308]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 07 21:25:38 compute-0 python3.9[56461]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 07 21:25:39 compute-0 python3.9[56615]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 07 21:25:40 compute-0 sudo[56770]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfxbqjleubuwqbpguqsnxdeecmdfpcaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872340.1590133-60-193401379438818/AnsiballZ_setup.py'
Oct 07 21:25:40 compute-0 sudo[56770]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:25:40 compute-0 python3.9[56772]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 07 21:25:41 compute-0 sudo[56770]: pam_unix(sudo:session): session closed for user root
Oct 07 21:25:41 compute-0 sudo[56854]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcavwlewjrvpgyvubknnrzmellocgnjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872340.1590133-60-193401379438818/AnsiballZ_dnf.py'
Oct 07 21:25:41 compute-0 sudo[56854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:25:41 compute-0 python3.9[56856]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 07 21:25:43 compute-0 sudo[56854]: pam_unix(sudo:session): session closed for user root
Oct 07 21:25:43 compute-0 sudo[57008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-taemqwhmjctjjxexojzzoweucbuyqjna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872343.2752278-84-25124730827514/AnsiballZ_setup.py'
Oct 07 21:25:43 compute-0 sudo[57008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:25:43 compute-0 python3.9[57010]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 07 21:25:44 compute-0 sudo[57008]: pam_unix(sudo:session): session closed for user root
Oct 07 21:25:45 compute-0 sudo[57199]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgzhzgsjvzaphluavlnghpfwzcfqgbjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872344.4790158-106-214760162980393/AnsiballZ_file.py'
Oct 07 21:25:45 compute-0 sudo[57199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:25:45 compute-0 python3.9[57201]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:25:45 compute-0 sudo[57199]: pam_unix(sudo:session): session closed for user root
Oct 07 21:25:45 compute-0 sudo[57351]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lsppbvbloljlnzxokrcwplkhpdifxnel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872345.4763515-122-12700568974283/AnsiballZ_command.py'
Oct 07 21:25:45 compute-0 sudo[57351]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:25:46 compute-0 python3.9[57353]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:25:46 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 07 21:25:46 compute-0 sudo[57351]: pam_unix(sudo:session): session closed for user root
Oct 07 21:25:46 compute-0 sudo[57514]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxogsdoobwezpwnzxeftsrywwccewkmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872346.428392-138-47641066021502/AnsiballZ_stat.py'
Oct 07 21:25:46 compute-0 sudo[57514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:25:47 compute-0 python3.9[57516]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:25:47 compute-0 sudo[57514]: pam_unix(sudo:session): session closed for user root
Oct 07 21:25:47 compute-0 sudo[57592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdqhizmpvauwzjdagzrajjuvhawzmisg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872346.428392-138-47641066021502/AnsiballZ_file.py'
Oct 07 21:25:47 compute-0 sudo[57592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:25:47 compute-0 python3.9[57594]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:25:47 compute-0 sudo[57592]: pam_unix(sudo:session): session closed for user root
Oct 07 21:25:48 compute-0 sudo[57744]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrxtfdbsnmddesohnrwsxmektmhpawfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872347.885526-162-110450387519727/AnsiballZ_stat.py'
Oct 07 21:25:48 compute-0 sudo[57744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:25:48 compute-0 python3.9[57746]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:25:48 compute-0 sudo[57744]: pam_unix(sudo:session): session closed for user root
Oct 07 21:25:48 compute-0 sudo[57822]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqhjzkrejjhinmnyrlzglkyllutgtkcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872347.885526-162-110450387519727/AnsiballZ_file.py'
Oct 07 21:25:48 compute-0 sudo[57822]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:25:48 compute-0 python3.9[57824]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:25:48 compute-0 sudo[57822]: pam_unix(sudo:session): session closed for user root
Oct 07 21:25:49 compute-0 sudo[57974]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fznjnszvfduischljxokgvcjbsvnfoso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872349.0685952-188-945350450646/AnsiballZ_ini_file.py'
Oct 07 21:25:49 compute-0 sudo[57974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:25:49 compute-0 python3.9[57976]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:25:49 compute-0 sudo[57974]: pam_unix(sudo:session): session closed for user root
Oct 07 21:25:50 compute-0 sudo[58126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzjrxkogijgadjmwjpxaqakbwfdtcewo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872350.055287-188-255403323941665/AnsiballZ_ini_file.py'
Oct 07 21:25:50 compute-0 sudo[58126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:25:50 compute-0 python3.9[58128]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:25:50 compute-0 sudo[58126]: pam_unix(sudo:session): session closed for user root
Oct 07 21:25:51 compute-0 sudo[58278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ybhvptazfcpqfrlcilvmalisuzovcswz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872350.7928908-188-106909586665731/AnsiballZ_ini_file.py'
Oct 07 21:25:51 compute-0 sudo[58278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:25:51 compute-0 python3.9[58280]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:25:51 compute-0 sudo[58278]: pam_unix(sudo:session): session closed for user root
Oct 07 21:25:51 compute-0 sudo[58430]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzpitmotjqocaocrjmkycbcuocstmrfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872351.5702598-188-114740129276091/AnsiballZ_ini_file.py'
Oct 07 21:25:51 compute-0 sudo[58430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:25:52 compute-0 python3.9[58432]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:25:52 compute-0 sudo[58430]: pam_unix(sudo:session): session closed for user root
Oct 07 21:25:52 compute-0 sudo[58582]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzwaysygpvtcdxfueiswvcjyyjfqliyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872352.4876254-250-83102426412122/AnsiballZ_dnf.py'
Oct 07 21:25:52 compute-0 sudo[58582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:25:53 compute-0 python3.9[58584]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 07 21:25:54 compute-0 sudo[58582]: pam_unix(sudo:session): session closed for user root
Oct 07 21:25:55 compute-0 sudo[58735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffjvuaxqkqyejnzvszwsaebvhuaksmip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872354.8691313-272-173015928316305/AnsiballZ_setup.py'
Oct 07 21:25:55 compute-0 sudo[58735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:25:55 compute-0 python3.9[58737]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 07 21:25:55 compute-0 sudo[58735]: pam_unix(sudo:session): session closed for user root
Oct 07 21:25:55 compute-0 sudo[58889]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ecyvztaqddswmtkmecyuxpljbnqxyqal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872355.716965-288-104617113715587/AnsiballZ_stat.py'
Oct 07 21:25:55 compute-0 sudo[58889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:25:56 compute-0 python3.9[58891]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 07 21:25:56 compute-0 sudo[58889]: pam_unix(sudo:session): session closed for user root
Oct 07 21:25:56 compute-0 sudo[59041]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evkkhrfrcbbrrurbeoksrjldxequhdtd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872356.5030663-306-276265765419613/AnsiballZ_stat.py'
Oct 07 21:25:56 compute-0 sudo[59041]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:25:57 compute-0 python3.9[59043]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 07 21:25:57 compute-0 sudo[59041]: pam_unix(sudo:session): session closed for user root
Oct 07 21:25:57 compute-0 sudo[59193]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqoyvovslbfzgabjykpibflonzyvauod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872357.449766-326-14656920427980/AnsiballZ_service_facts.py'
Oct 07 21:25:57 compute-0 sudo[59193]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:25:58 compute-0 python3.9[59195]: ansible-service_facts Invoked
Oct 07 21:25:58 compute-0 network[59212]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 07 21:25:58 compute-0 network[59213]: 'network-scripts' will be removed from distribution in near future.
Oct 07 21:25:58 compute-0 network[59214]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 07 21:26:00 compute-0 sudo[59193]: pam_unix(sudo:session): session closed for user root
Oct 07 21:26:01 compute-0 sudo[59499]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnyhzdcuwcdwnxqwozvhxdlwsjjcijgk ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1759872361.6291323-352-78088661282872/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1759872361.6291323-352-78088661282872/args'
Oct 07 21:26:01 compute-0 sudo[59499]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:26:02 compute-0 sudo[59499]: pam_unix(sudo:session): session closed for user root
Oct 07 21:26:02 compute-0 sudo[59666]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzddknuadclphkwctgmorraxgcmkbrvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872362.357567-374-189298926860507/AnsiballZ_dnf.py'
Oct 07 21:26:02 compute-0 sudo[59666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:26:02 compute-0 python3.9[59668]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 07 21:26:04 compute-0 sudo[59666]: pam_unix(sudo:session): session closed for user root
Oct 07 21:26:05 compute-0 sudo[59819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkugsnrnfuuaeltpqjmlexjzckrvupxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872364.4684756-400-12850128549784/AnsiballZ_package_facts.py'
Oct 07 21:26:05 compute-0 sudo[59819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:26:05 compute-0 python3.9[59821]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Oct 07 21:26:05 compute-0 sudo[59819]: pam_unix(sudo:session): session closed for user root
Oct 07 21:26:06 compute-0 sudo[59973]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmgngxykzeqpmiodcmjpxtyixqfkkgak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872366.365892-420-44401462938877/AnsiballZ_stat.py'
Oct 07 21:26:06 compute-0 sudo[59973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:26:06 compute-0 python3.9[59975]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:26:06 compute-0 sudo[59973]: pam_unix(sudo:session): session closed for user root
Oct 07 21:26:07 compute-0 sshd-session[59870]: Received disconnect from 91.224.92.108 port 25118:11:  [preauth]
Oct 07 21:26:07 compute-0 sshd-session[59870]: Disconnected from authenticating user root 91.224.92.108 port 25118 [preauth]
Oct 07 21:26:07 compute-0 sudo[60098]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntkayppolzkidafffxgybtpsyoqbhmif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872366.365892-420-44401462938877/AnsiballZ_copy.py'
Oct 07 21:26:07 compute-0 sudo[60098]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:26:07 compute-0 python3.9[60100]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759872366.365892-420-44401462938877/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:26:07 compute-0 sudo[60098]: pam_unix(sudo:session): session closed for user root
Oct 07 21:26:08 compute-0 sudo[60252]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-scvxcxanwroovzkuukadldteburtniyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872367.8893213-450-267925902880465/AnsiballZ_stat.py'
Oct 07 21:26:08 compute-0 sudo[60252]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:26:08 compute-0 python3.9[60254]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:26:08 compute-0 sudo[60252]: pam_unix(sudo:session): session closed for user root
Oct 07 21:26:08 compute-0 sudo[60377]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkvhprvnzpjlhhenlkprauiabdzauzpe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872367.8893213-450-267925902880465/AnsiballZ_copy.py'
Oct 07 21:26:08 compute-0 sudo[60377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:26:08 compute-0 python3.9[60379]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759872367.8893213-450-267925902880465/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:26:08 compute-0 sudo[60377]: pam_unix(sudo:session): session closed for user root
Oct 07 21:26:10 compute-0 sudo[60531]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fyigqpxbvftwwlshqgsbxrgcklwujtew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872369.549661-492-41705402830713/AnsiballZ_lineinfile.py'
Oct 07 21:26:10 compute-0 sudo[60531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:26:10 compute-0 python3.9[60533]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:26:10 compute-0 sudo[60531]: pam_unix(sudo:session): session closed for user root
Oct 07 21:26:11 compute-0 sudo[60685]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bykigidoxsijorozshhhoyydxmexvslw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872371.026519-522-188325653803898/AnsiballZ_setup.py'
Oct 07 21:26:11 compute-0 sudo[60685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:26:11 compute-0 python3.9[60687]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 07 21:26:11 compute-0 sudo[60685]: pam_unix(sudo:session): session closed for user root
Oct 07 21:26:12 compute-0 sudo[60769]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgirtwowhtlsjdawniekdzudvyawdkwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872371.026519-522-188325653803898/AnsiballZ_systemd.py'
Oct 07 21:26:12 compute-0 sudo[60769]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:26:12 compute-0 python3.9[60771]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 07 21:26:12 compute-0 sudo[60769]: pam_unix(sudo:session): session closed for user root
Oct 07 21:26:13 compute-0 sudo[60923]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqetmitolakfjmvofvryhyiaedvqepsx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872373.5296085-554-14256664122163/AnsiballZ_setup.py'
Oct 07 21:26:13 compute-0 sudo[60923]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:26:14 compute-0 python3.9[60925]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 07 21:26:14 compute-0 sudo[60923]: pam_unix(sudo:session): session closed for user root
Oct 07 21:26:14 compute-0 sudo[61007]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zktxckihqaobrqzyahwcgjthrxslczdm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872373.5296085-554-14256664122163/AnsiballZ_systemd.py'
Oct 07 21:26:14 compute-0 sudo[61007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:26:14 compute-0 python3.9[61009]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 07 21:26:14 compute-0 chronyd[788]: chronyd exiting
Oct 07 21:26:14 compute-0 systemd[1]: Stopping NTP client/server...
Oct 07 21:26:15 compute-0 systemd[1]: chronyd.service: Deactivated successfully.
Oct 07 21:26:15 compute-0 systemd[1]: Stopped NTP client/server.
Oct 07 21:26:15 compute-0 systemd[1]: Starting NTP client/server...
Oct 07 21:26:15 compute-0 chronyd[61017]: chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Oct 07 21:26:15 compute-0 chronyd[61017]: Frequency -28.725 +/- 0.076 ppm read from /var/lib/chrony/drift
Oct 07 21:26:15 compute-0 chronyd[61017]: Loaded seccomp filter (level 2)
Oct 07 21:26:15 compute-0 systemd[1]: Started NTP client/server.
Oct 07 21:26:15 compute-0 sudo[61007]: pam_unix(sudo:session): session closed for user root
Oct 07 21:26:15 compute-0 sshd-session[56311]: Connection closed by 192.168.122.30 port 43778
Oct 07 21:26:15 compute-0 sshd-session[56308]: pam_unix(sshd:session): session closed for user zuul
Oct 07 21:26:15 compute-0 systemd[1]: session-14.scope: Deactivated successfully.
Oct 07 21:26:15 compute-0 systemd[1]: session-14.scope: Consumed 25.786s CPU time.
Oct 07 21:26:15 compute-0 systemd-logind[798]: Session 14 logged out. Waiting for processes to exit.
Oct 07 21:26:15 compute-0 systemd-logind[798]: Removed session 14.
Oct 07 21:26:21 compute-0 sshd-session[61043]: Accepted publickey for zuul from 192.168.122.30 port 36542 ssh2: ECDSA SHA256:OH28aSFFDwSUyue/q6XYLqn3ZIRCwAhLEh6x3UJ/Ca0
Oct 07 21:26:21 compute-0 systemd-logind[798]: New session 15 of user zuul.
Oct 07 21:26:21 compute-0 systemd[1]: Started Session 15 of User zuul.
Oct 07 21:26:21 compute-0 sshd-session[61043]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 07 21:26:22 compute-0 python3.9[61196]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 07 21:26:23 compute-0 sudo[61350]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajbcplzvuyvbfjxnrisdugynknzbfhkk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872382.768019-46-54017796573385/AnsiballZ_file.py'
Oct 07 21:26:23 compute-0 sudo[61350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:26:23 compute-0 python3.9[61352]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:26:23 compute-0 sudo[61350]: pam_unix(sudo:session): session closed for user root
Oct 07 21:26:24 compute-0 sudo[61525]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgzlcsuoaixrzvyrrsawunogbuiicxmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872383.60026-62-230313746359607/AnsiballZ_stat.py'
Oct 07 21:26:24 compute-0 sudo[61525]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:26:24 compute-0 python3.9[61527]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:26:24 compute-0 sudo[61525]: pam_unix(sudo:session): session closed for user root
Oct 07 21:26:24 compute-0 sudo[61603]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwbptsstjvkmqdhfiydvuvmsqppawxew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872383.60026-62-230313746359607/AnsiballZ_file.py'
Oct 07 21:26:24 compute-0 sudo[61603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:26:24 compute-0 python3.9[61605]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.xtm0dz3d recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:26:24 compute-0 sudo[61603]: pam_unix(sudo:session): session closed for user root
Oct 07 21:26:25 compute-0 sudo[61755]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnydgifznsiopisirpixklxjdsxepavj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872385.1921515-102-180909468724304/AnsiballZ_stat.py'
Oct 07 21:26:25 compute-0 sudo[61755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:26:25 compute-0 python3.9[61757]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:26:25 compute-0 sudo[61755]: pam_unix(sudo:session): session closed for user root
Oct 07 21:26:26 compute-0 sudo[61878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgfxdraogcvkcgnpbxtspgfvdctrwyix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872385.1921515-102-180909468724304/AnsiballZ_copy.py'
Oct 07 21:26:26 compute-0 sudo[61878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:26:26 compute-0 python3.9[61880]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759872385.1921515-102-180909468724304/.source _original_basename=.kcdcy5og follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:26:26 compute-0 sudo[61878]: pam_unix(sudo:session): session closed for user root
Oct 07 21:26:27 compute-0 sudo[62030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbvjjkgyhcbinahhssrknmakajicczad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872386.764267-134-276129663554100/AnsiballZ_file.py'
Oct 07 21:26:27 compute-0 sudo[62030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:26:27 compute-0 python3.9[62032]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:26:27 compute-0 sudo[62030]: pam_unix(sudo:session): session closed for user root
Oct 07 21:26:27 compute-0 sudo[62182]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzeehqpvfetopzsutnekndcufphhmqkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872387.5709405-150-55797462179113/AnsiballZ_stat.py'
Oct 07 21:26:27 compute-0 sudo[62182]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:26:28 compute-0 python3.9[62184]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:26:28 compute-0 sudo[62182]: pam_unix(sudo:session): session closed for user root
Oct 07 21:26:28 compute-0 sudo[62305]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-koxlwlnkjkuinfrauzuwovkvcrqbebtl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872387.5709405-150-55797462179113/AnsiballZ_copy.py'
Oct 07 21:26:28 compute-0 sudo[62305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:26:28 compute-0 python3.9[62307]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759872387.5709405-150-55797462179113/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:26:28 compute-0 sudo[62305]: pam_unix(sudo:session): session closed for user root
Oct 07 21:26:29 compute-0 sudo[62457]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdzmqxcnpzwpgcryozakawbqwewvgidp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872388.8148503-150-261937331670830/AnsiballZ_stat.py'
Oct 07 21:26:29 compute-0 sudo[62457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:26:29 compute-0 python3.9[62459]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:26:29 compute-0 sudo[62457]: pam_unix(sudo:session): session closed for user root
Oct 07 21:26:29 compute-0 sudo[62580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbxedngxxuzuuibizecridwuvfdgezjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872388.8148503-150-261937331670830/AnsiballZ_copy.py'
Oct 07 21:26:29 compute-0 sudo[62580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:26:29 compute-0 python3.9[62582]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759872388.8148503-150-261937331670830/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:26:29 compute-0 sudo[62580]: pam_unix(sudo:session): session closed for user root
Oct 07 21:26:30 compute-0 sudo[62732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztgqolegmaphfubnblkznsqimlphwnxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872390.2940018-208-121744204876823/AnsiballZ_file.py'
Oct 07 21:26:30 compute-0 sudo[62732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:26:30 compute-0 python3.9[62734]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:26:30 compute-0 sudo[62732]: pam_unix(sudo:session): session closed for user root
Oct 07 21:26:31 compute-0 sudo[62884]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cgsgqezfzmfdyiiswofzwyiwnoazjkvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872391.099659-224-185829027191061/AnsiballZ_stat.py'
Oct 07 21:26:31 compute-0 sudo[62884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:26:31 compute-0 python3.9[62886]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:26:31 compute-0 sudo[62884]: pam_unix(sudo:session): session closed for user root
Oct 07 21:26:31 compute-0 sudo[63007]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujnsodngxplutjbzomzqwuammzkhyyke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872391.099659-224-185829027191061/AnsiballZ_copy.py'
Oct 07 21:26:31 compute-0 sudo[63007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:26:32 compute-0 python3.9[63009]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759872391.099659-224-185829027191061/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:26:32 compute-0 sudo[63007]: pam_unix(sudo:session): session closed for user root
Oct 07 21:26:32 compute-0 sudo[63159]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewpktctfdcklakmsryfrabxszwfddvll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872392.566866-254-257678319690092/AnsiballZ_stat.py'
Oct 07 21:26:32 compute-0 sudo[63159]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:26:33 compute-0 python3.9[63161]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:26:33 compute-0 sudo[63159]: pam_unix(sudo:session): session closed for user root
Oct 07 21:26:33 compute-0 sudo[63282]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eilkqigulilccmvmeeqtxfwuosslceoo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872392.566866-254-257678319690092/AnsiballZ_copy.py'
Oct 07 21:26:33 compute-0 sudo[63282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:26:33 compute-0 python3.9[63284]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759872392.566866-254-257678319690092/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:26:33 compute-0 sudo[63282]: pam_unix(sudo:session): session closed for user root
Oct 07 21:26:34 compute-0 sudo[63434]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxpgcuddsnunjxozkjnhvnbjjmvhgijq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872394.1645362-284-63826490744568/AnsiballZ_systemd.py'
Oct 07 21:26:34 compute-0 sudo[63434]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:26:35 compute-0 python3.9[63436]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 07 21:26:35 compute-0 systemd[1]: Reloading.
Oct 07 21:26:35 compute-0 systemd-rc-local-generator[63458]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:26:35 compute-0 systemd-sysv-generator[63465]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:26:35 compute-0 systemd[1]: Reloading.
Oct 07 21:26:35 compute-0 systemd-rc-local-generator[63497]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:26:35 compute-0 systemd-sysv-generator[63501]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:26:35 compute-0 systemd[1]: Starting EDPM Container Shutdown...
Oct 07 21:26:35 compute-0 systemd[1]: Finished EDPM Container Shutdown.
Oct 07 21:26:35 compute-0 sudo[63434]: pam_unix(sudo:session): session closed for user root
Oct 07 21:26:36 compute-0 sudo[63661]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fakzpgxfzhbwnkdcmpuqppjlvfudnevo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872395.932207-300-113936433510212/AnsiballZ_stat.py'
Oct 07 21:26:36 compute-0 sudo[63661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:26:36 compute-0 python3.9[63663]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:26:36 compute-0 sudo[63661]: pam_unix(sudo:session): session closed for user root
Oct 07 21:26:36 compute-0 sudo[63784]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-werucocapbbunmmpbxilvgftlwoivjou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872395.932207-300-113936433510212/AnsiballZ_copy.py'
Oct 07 21:26:36 compute-0 sudo[63784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:26:36 compute-0 python3.9[63786]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759872395.932207-300-113936433510212/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:26:36 compute-0 sudo[63784]: pam_unix(sudo:session): session closed for user root
Oct 07 21:26:37 compute-0 sudo[63936]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yydwxljsupmjfeijgiowunjacpaykcqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872397.5138862-330-254858608367118/AnsiballZ_stat.py'
Oct 07 21:26:37 compute-0 sudo[63936]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:26:37 compute-0 python3.9[63938]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:26:37 compute-0 sudo[63936]: pam_unix(sudo:session): session closed for user root
Oct 07 21:26:38 compute-0 sudo[64059]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhuswiabaquhmtwavqbmxahjgkyzerft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872397.5138862-330-254858608367118/AnsiballZ_copy.py'
Oct 07 21:26:38 compute-0 sudo[64059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:26:38 compute-0 python3.9[64061]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759872397.5138862-330-254858608367118/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:26:38 compute-0 sudo[64059]: pam_unix(sudo:session): session closed for user root
Oct 07 21:26:39 compute-0 sudo[64211]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzavezqqfvbvwsqfpyjorufggaknlgwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872398.942298-360-27269517625415/AnsiballZ_systemd.py'
Oct 07 21:26:39 compute-0 sudo[64211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:26:39 compute-0 python3.9[64213]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 07 21:26:39 compute-0 systemd[1]: Reloading.
Oct 07 21:26:39 compute-0 systemd-rc-local-generator[64242]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:26:39 compute-0 systemd-sysv-generator[64246]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:26:39 compute-0 systemd[1]: Reloading.
Oct 07 21:26:40 compute-0 systemd-rc-local-generator[64284]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:26:40 compute-0 systemd-sysv-generator[64287]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:26:40 compute-0 systemd[1]: Starting Create netns directory...
Oct 07 21:26:40 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 07 21:26:40 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 07 21:26:40 compute-0 systemd[1]: Finished Create netns directory.
Oct 07 21:26:40 compute-0 sudo[64211]: pam_unix(sudo:session): session closed for user root
Oct 07 21:26:41 compute-0 python3.9[64442]: ansible-ansible.builtin.service_facts Invoked
Oct 07 21:26:41 compute-0 network[64459]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 07 21:26:41 compute-0 network[64460]: 'network-scripts' will be removed from distribution in near future.
Oct 07 21:26:41 compute-0 network[64461]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 07 21:26:44 compute-0 sudo[64723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxvyrnnzymqjwjsaqdqixcdkcdwynxip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872404.4492297-392-205941014170925/AnsiballZ_systemd.py'
Oct 07 21:26:44 compute-0 sudo[64723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:26:45 compute-0 python3.9[64725]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 07 21:26:46 compute-0 systemd[1]: Reloading.
Oct 07 21:26:46 compute-0 systemd-rc-local-generator[64755]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:26:46 compute-0 systemd-sysv-generator[64758]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:26:46 compute-0 systemd[1]: Stopping IPv4 firewall with iptables...
Oct 07 21:26:46 compute-0 iptables.init[64765]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Oct 07 21:26:46 compute-0 iptables.init[64765]: iptables: Flushing firewall rules: [  OK  ]
Oct 07 21:26:46 compute-0 systemd[1]: iptables.service: Deactivated successfully.
Oct 07 21:26:46 compute-0 systemd[1]: Stopped IPv4 firewall with iptables.
Oct 07 21:26:46 compute-0 sudo[64723]: pam_unix(sudo:session): session closed for user root
Oct 07 21:26:47 compute-0 sudo[64959]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-norjbathsdnttrgqotetfjcjzpyuvadh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872407.0878115-392-16690104427511/AnsiballZ_systemd.py'
Oct 07 21:26:47 compute-0 sudo[64959]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:26:47 compute-0 python3.9[64961]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 07 21:26:47 compute-0 sudo[64959]: pam_unix(sudo:session): session closed for user root
Oct 07 21:26:48 compute-0 sudo[65113]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlpshsyvtevsmetpyatjojvbclumzkbi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872408.0415878-424-264848295013936/AnsiballZ_systemd.py'
Oct 07 21:26:48 compute-0 sudo[65113]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:26:48 compute-0 python3.9[65115]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 07 21:26:48 compute-0 systemd[1]: Reloading.
Oct 07 21:26:48 compute-0 systemd-rc-local-generator[65140]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:26:48 compute-0 systemd-sysv-generator[65147]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:26:48 compute-0 systemd[1]: Starting Netfilter Tables...
Oct 07 21:26:48 compute-0 systemd[1]: Finished Netfilter Tables.
Oct 07 21:26:48 compute-0 sudo[65113]: pam_unix(sudo:session): session closed for user root
Oct 07 21:26:49 compute-0 sudo[65305]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpybsacziqfddpvmckfjtouiaeldnbwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872409.1758327-440-186595769195442/AnsiballZ_command.py'
Oct 07 21:26:49 compute-0 sudo[65305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:26:49 compute-0 python3.9[65307]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:26:49 compute-0 sudo[65305]: pam_unix(sudo:session): session closed for user root
Oct 07 21:26:50 compute-0 sudo[65458]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnsfwmrlgacpchmdoasgxqjqmtcsmfxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872410.510351-468-88843609908724/AnsiballZ_stat.py'
Oct 07 21:26:50 compute-0 sudo[65458]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:26:51 compute-0 python3.9[65460]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:26:51 compute-0 sudo[65458]: pam_unix(sudo:session): session closed for user root
Oct 07 21:26:51 compute-0 sudo[65583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnvglzlgxkzqvqdzseuwolowcaxvlsqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872410.510351-468-88843609908724/AnsiballZ_copy.py'
Oct 07 21:26:51 compute-0 sudo[65583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:26:51 compute-0 python3.9[65585]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759872410.510351-468-88843609908724/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=4729b6ffc5b555fa142bf0b6e6dc15609cb89a22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:26:51 compute-0 sudo[65583]: pam_unix(sudo:session): session closed for user root
Oct 07 21:26:52 compute-0 python3.9[65736]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 07 21:26:52 compute-0 polkitd[6157]: Registered Authentication Agent for unix-process:65738:214643 (system bus name :1.555 [/usr/bin/pkttyagent --notify-fd 5 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8)
Oct 07 21:27:04 compute-0 sshd-session[65752]: Invalid user ts3 from 103.115.24.11 port 38714
Oct 07 21:27:04 compute-0 sshd-session[65752]: Received disconnect from 103.115.24.11 port 38714:11: Bye Bye [preauth]
Oct 07 21:27:04 compute-0 sshd-session[65752]: Disconnected from invalid user ts3 103.115.24.11 port 38714 [preauth]
Oct 07 21:27:17 compute-0 polkitd[6157]: Unregistered Authentication Agent for unix-process:65738:214643 (system bus name :1.555, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8) (disconnected from bus)
Oct 07 21:27:17 compute-0 polkit-agent-helper-1[65750]: pam_unix(polkit-1:auth): conversation failed
Oct 07 21:27:17 compute-0 polkit-agent-helper-1[65750]: pam_unix(polkit-1:auth): auth could not identify password for [root]
Oct 07 21:27:17 compute-0 polkitd[6157]: Operator of unix-process:65738:214643 FAILED to authenticate to gain authorization for action org.freedesktop.systemd1.manage-units for system-bus-name::1.554 [<unknown>] (owned by unix-user:zuul)
Oct 07 21:27:18 compute-0 sshd-session[61046]: Connection closed by 192.168.122.30 port 36542
Oct 07 21:27:18 compute-0 sshd-session[61043]: pam_unix(sshd:session): session closed for user zuul
Oct 07 21:27:18 compute-0 systemd-logind[798]: Session 15 logged out. Waiting for processes to exit.
Oct 07 21:27:18 compute-0 systemd[1]: session-15.scope: Deactivated successfully.
Oct 07 21:27:18 compute-0 systemd[1]: session-15.scope: Consumed 19.241s CPU time.
Oct 07 21:27:18 compute-0 systemd-logind[798]: Removed session 15.
Oct 07 21:27:30 compute-0 sshd-session[65779]: Accepted publickey for zuul from 192.168.122.30 port 50902 ssh2: ECDSA SHA256:OH28aSFFDwSUyue/q6XYLqn3ZIRCwAhLEh6x3UJ/Ca0
Oct 07 21:27:30 compute-0 systemd-logind[798]: New session 16 of user zuul.
Oct 07 21:27:30 compute-0 systemd[1]: Started Session 16 of User zuul.
Oct 07 21:27:30 compute-0 sshd-session[65779]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 07 21:27:31 compute-0 python3.9[65932]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 07 21:27:32 compute-0 sudo[66086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uyogcjjrgjqgfvyuuuuykpqafeurrtzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872452.4126587-46-19322428999473/AnsiballZ_file.py'
Oct 07 21:27:32 compute-0 sudo[66086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:27:33 compute-0 python3.9[66088]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:27:33 compute-0 sudo[66086]: pam_unix(sudo:session): session closed for user root
Oct 07 21:27:33 compute-0 sudo[66261]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hzicaomsbondokczshizwaocpkcwudbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872453.3996847-62-89064038818899/AnsiballZ_stat.py'
Oct 07 21:27:33 compute-0 sudo[66261]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:27:34 compute-0 python3.9[66263]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:27:34 compute-0 sudo[66261]: pam_unix(sudo:session): session closed for user root
Oct 07 21:27:34 compute-0 sudo[66339]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzqhlfkvtpgcfkzwsmdxcrbzlzbzqxjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872453.3996847-62-89064038818899/AnsiballZ_file.py'
Oct 07 21:27:34 compute-0 sudo[66339]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:27:34 compute-0 python3.9[66341]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.s5j57lo_ recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:27:34 compute-0 sudo[66339]: pam_unix(sudo:session): session closed for user root
Oct 07 21:27:35 compute-0 sudo[66491]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrroawhedarodonqmhhjvwyutnqjkgaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872455.1577728-102-235168791648359/AnsiballZ_stat.py'
Oct 07 21:27:35 compute-0 sudo[66491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:27:35 compute-0 python3.9[66493]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:27:35 compute-0 sudo[66491]: pam_unix(sudo:session): session closed for user root
Oct 07 21:27:36 compute-0 sudo[66569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjwglabiwtyttuarsdjlldoegbcsshmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872455.1577728-102-235168791648359/AnsiballZ_file.py'
Oct 07 21:27:36 compute-0 sudo[66569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:27:36 compute-0 python3.9[66571]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/podman_drop_in _original_basename=.b01whqk5 recurse=False state=file path=/etc/sysconfig/podman_drop_in force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:27:36 compute-0 sudo[66569]: pam_unix(sudo:session): session closed for user root
Oct 07 21:27:36 compute-0 sudo[66721]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvqxhwltfcghixzrekbkavqbcvdhiqxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872456.5162773-128-134778743535658/AnsiballZ_file.py'
Oct 07 21:27:36 compute-0 sudo[66721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:27:37 compute-0 python3.9[66723]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:27:37 compute-0 sudo[66721]: pam_unix(sudo:session): session closed for user root
Oct 07 21:27:37 compute-0 sudo[66873]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wyhlnzvxtcjrheelkwmsxfwgavcjeatq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872457.2472937-144-6071614194975/AnsiballZ_stat.py'
Oct 07 21:27:37 compute-0 sudo[66873]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:27:37 compute-0 python3.9[66875]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:27:37 compute-0 sudo[66873]: pam_unix(sudo:session): session closed for user root
Oct 07 21:27:38 compute-0 sudo[66951]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzurfkvojfuljpcmhhkvfstmhoflfngo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872457.2472937-144-6071614194975/AnsiballZ_file.py'
Oct 07 21:27:38 compute-0 sudo[66951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:27:38 compute-0 python3.9[66953]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:27:38 compute-0 sudo[66951]: pam_unix(sudo:session): session closed for user root
Oct 07 21:27:38 compute-0 sudo[67103]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmzwvowmhgeuknehtlytehgnhznzbjvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872458.5059037-144-89460193518065/AnsiballZ_stat.py'
Oct 07 21:27:38 compute-0 sudo[67103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:27:39 compute-0 python3.9[67105]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:27:39 compute-0 sudo[67103]: pam_unix(sudo:session): session closed for user root
Oct 07 21:27:39 compute-0 sudo[67181]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwrbwcaejpdtvqklyvgwwprljrrexcuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872458.5059037-144-89460193518065/AnsiballZ_file.py'
Oct 07 21:27:39 compute-0 sudo[67181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:27:39 compute-0 python3.9[67183]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:27:39 compute-0 sudo[67181]: pam_unix(sudo:session): session closed for user root
Oct 07 21:27:40 compute-0 sudo[67333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozodeqgeiikhhtyimbgkobrcihxprzbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872459.8550584-190-44533744313536/AnsiballZ_file.py'
Oct 07 21:27:40 compute-0 sudo[67333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:27:40 compute-0 python3.9[67335]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:27:40 compute-0 sudo[67333]: pam_unix(sudo:session): session closed for user root
Oct 07 21:27:40 compute-0 sudo[67485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldwtjgwbpqttksrdbzuftrmbboztlsne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872460.6089582-206-11169305683168/AnsiballZ_stat.py'
Oct 07 21:27:40 compute-0 sudo[67485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:27:41 compute-0 python3.9[67487]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:27:41 compute-0 sudo[67485]: pam_unix(sudo:session): session closed for user root
Oct 07 21:27:41 compute-0 sudo[67563]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fdlyjtxlsjgthsyddxbuqvtzlqjgzngu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872460.6089582-206-11169305683168/AnsiballZ_file.py'
Oct 07 21:27:41 compute-0 sudo[67563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:27:41 compute-0 python3.9[67565]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:27:41 compute-0 sudo[67563]: pam_unix(sudo:session): session closed for user root
Oct 07 21:27:42 compute-0 sudo[67715]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdwsqiwuyuvpcvjjuxgcczrelleakenu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872462.0100198-230-66372736213451/AnsiballZ_stat.py'
Oct 07 21:27:42 compute-0 sudo[67715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:27:42 compute-0 python3.9[67717]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:27:42 compute-0 sudo[67715]: pam_unix(sudo:session): session closed for user root
Oct 07 21:27:42 compute-0 sudo[67793]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymascyjrfyasflopgjkclazhhckiaupa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872462.0100198-230-66372736213451/AnsiballZ_file.py'
Oct 07 21:27:42 compute-0 sudo[67793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:27:43 compute-0 python3.9[67795]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:27:43 compute-0 sudo[67793]: pam_unix(sudo:session): session closed for user root
Oct 07 21:27:44 compute-0 sudo[67945]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgiuyzlewxdpzjfusgzwgcvqzppcsyjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872463.3819818-254-169938971913360/AnsiballZ_systemd.py'
Oct 07 21:27:44 compute-0 sudo[67945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:27:44 compute-0 python3.9[67947]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 07 21:27:44 compute-0 systemd[1]: Reloading.
Oct 07 21:27:44 compute-0 systemd-rc-local-generator[67972]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:27:44 compute-0 systemd-sysv-generator[67976]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:27:44 compute-0 sudo[67945]: pam_unix(sudo:session): session closed for user root
Oct 07 21:27:45 compute-0 sudo[68133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frcbuqemanfpapznjunyjddwcwkasbuf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872464.8889089-270-69089677033043/AnsiballZ_stat.py'
Oct 07 21:27:45 compute-0 sudo[68133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:27:45 compute-0 python3.9[68135]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:27:45 compute-0 sudo[68133]: pam_unix(sudo:session): session closed for user root
Oct 07 21:27:45 compute-0 sudo[68211]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eaiqwconvrzwuvzfhmrrjlrxsfwmwfai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872464.8889089-270-69089677033043/AnsiballZ_file.py'
Oct 07 21:27:45 compute-0 sudo[68211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:27:45 compute-0 python3.9[68213]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:27:45 compute-0 sudo[68211]: pam_unix(sudo:session): session closed for user root
Oct 07 21:27:46 compute-0 sudo[68363]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjboigofyupsdujleqdtfbaunyvlihic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872466.1442797-294-160934525588386/AnsiballZ_stat.py'
Oct 07 21:27:46 compute-0 sudo[68363]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:27:46 compute-0 python3.9[68365]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:27:46 compute-0 sudo[68363]: pam_unix(sudo:session): session closed for user root
Oct 07 21:27:47 compute-0 sudo[68441]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpjhyzpizrvvalimnwyyafvdkvapwcle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872466.1442797-294-160934525588386/AnsiballZ_file.py'
Oct 07 21:27:47 compute-0 sudo[68441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:27:47 compute-0 python3.9[68443]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:27:47 compute-0 sudo[68441]: pam_unix(sudo:session): session closed for user root
Oct 07 21:27:47 compute-0 sudo[68593]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eaektefcruhwwbccougsfwbqishrmjwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872467.4701228-318-246045999903374/AnsiballZ_systemd.py'
Oct 07 21:27:47 compute-0 sudo[68593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:27:48 compute-0 python3.9[68595]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 07 21:27:48 compute-0 systemd[1]: Reloading.
Oct 07 21:27:48 compute-0 systemd-rc-local-generator[68622]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:27:48 compute-0 systemd-sysv-generator[68625]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:27:48 compute-0 systemd[1]: Starting Create netns directory...
Oct 07 21:27:48 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 07 21:27:48 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 07 21:27:48 compute-0 systemd[1]: Finished Create netns directory.
Oct 07 21:27:48 compute-0 sudo[68593]: pam_unix(sudo:session): session closed for user root
Oct 07 21:27:49 compute-0 python3.9[68785]: ansible-ansible.builtin.service_facts Invoked
Oct 07 21:27:49 compute-0 network[68802]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 07 21:27:49 compute-0 network[68803]: 'network-scripts' will be removed from distribution in near future.
Oct 07 21:27:49 compute-0 network[68804]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 07 21:27:56 compute-0 sudo[69065]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwudjwcqhzyfmssrlojgjilvllmnkbwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872476.250176-370-166638514186701/AnsiballZ_stat.py'
Oct 07 21:27:56 compute-0 sudo[69065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:27:56 compute-0 python3.9[69067]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:27:56 compute-0 sudo[69065]: pam_unix(sudo:session): session closed for user root
Oct 07 21:27:57 compute-0 sudo[69143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzamptnmpjnffonmyqfhmxrbjhormtdf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872476.250176-370-166638514186701/AnsiballZ_file.py'
Oct 07 21:27:57 compute-0 sudo[69143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:27:57 compute-0 python3.9[69145]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/etc/ssh/sshd_config _original_basename=sshd_config_block.j2 recurse=False state=file path=/etc/ssh/sshd_config force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:27:57 compute-0 sudo[69143]: pam_unix(sudo:session): session closed for user root
Oct 07 21:27:57 compute-0 sudo[69295]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iopoobyhvndukrvodycsjwjwxewnitpg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872477.635941-396-149000693138529/AnsiballZ_file.py'
Oct 07 21:27:57 compute-0 sudo[69295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:27:58 compute-0 python3.9[69297]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:27:58 compute-0 sudo[69295]: pam_unix(sudo:session): session closed for user root
Oct 07 21:27:58 compute-0 sudo[69447]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlzrwarxuiimhnyexfcjpjbeaechpfdj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872478.3971279-412-13016317267922/AnsiballZ_stat.py'
Oct 07 21:27:58 compute-0 sudo[69447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:27:58 compute-0 python3.9[69449]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:27:58 compute-0 sudo[69447]: pam_unix(sudo:session): session closed for user root
Oct 07 21:27:59 compute-0 sudo[69570]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njpdmxsblskycqoakuiwaadqnrizoxlk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872478.3971279-412-13016317267922/AnsiballZ_copy.py'
Oct 07 21:27:59 compute-0 sudo[69570]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:27:59 compute-0 python3.9[69572]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759872478.3971279-412-13016317267922/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:27:59 compute-0 sudo[69570]: pam_unix(sudo:session): session closed for user root
Oct 07 21:28:00 compute-0 sudo[69722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgkyrogvzufihehcmcveavoenlxonmcy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872480.0485349-448-1240535375344/AnsiballZ_timezone.py'
Oct 07 21:28:00 compute-0 sudo[69722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:28:00 compute-0 python3.9[69724]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Oct 07 21:28:00 compute-0 systemd[1]: Starting Time & Date Service...
Oct 07 21:28:00 compute-0 systemd[1]: Started Time & Date Service.
Oct 07 21:28:00 compute-0 sudo[69722]: pam_unix(sudo:session): session closed for user root
Oct 07 21:28:01 compute-0 sudo[69878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkmctzxvonzknpgoddkctzmhhjempmzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872481.2364395-466-202969975971964/AnsiballZ_file.py'
Oct 07 21:28:01 compute-0 sudo[69878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:28:01 compute-0 python3.9[69880]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:28:01 compute-0 sudo[69878]: pam_unix(sudo:session): session closed for user root
Oct 07 21:28:02 compute-0 sudo[70030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ceaxvtxnuawpcaushhwwpguywiiswgsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872482.046771-482-83222574145204/AnsiballZ_stat.py'
Oct 07 21:28:02 compute-0 sudo[70030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:28:02 compute-0 python3.9[70032]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:28:02 compute-0 sudo[70030]: pam_unix(sudo:session): session closed for user root
Oct 07 21:28:02 compute-0 sudo[70153]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eagoanknvksohpsuszsjetjivkgjnypq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872482.046771-482-83222574145204/AnsiballZ_copy.py'
Oct 07 21:28:02 compute-0 sudo[70153]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:28:03 compute-0 python3.9[70155]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759872482.046771-482-83222574145204/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:28:03 compute-0 sudo[70153]: pam_unix(sudo:session): session closed for user root
Oct 07 21:28:03 compute-0 sudo[70305]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulqqwqsabnqqedqrowyoqpbjynuisvxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872483.5500321-512-6673325163039/AnsiballZ_stat.py'
Oct 07 21:28:03 compute-0 sudo[70305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:28:04 compute-0 python3.9[70307]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:28:04 compute-0 sudo[70305]: pam_unix(sudo:session): session closed for user root
Oct 07 21:28:04 compute-0 sudo[70428]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fewlocdruepltwzixgljxhaufqdbdirq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872483.5500321-512-6673325163039/AnsiballZ_copy.py'
Oct 07 21:28:04 compute-0 sudo[70428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:28:04 compute-0 python3.9[70430]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759872483.5500321-512-6673325163039/.source.yaml _original_basename=.aplpq0ky follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:28:04 compute-0 sudo[70428]: pam_unix(sudo:session): session closed for user root
Oct 07 21:28:05 compute-0 sudo[70580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zgnzujcupzawqvvxscwmjhtdzejivyiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872485.1821277-542-15039753152355/AnsiballZ_stat.py'
Oct 07 21:28:05 compute-0 sudo[70580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:28:05 compute-0 python3.9[70582]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:28:05 compute-0 sudo[70580]: pam_unix(sudo:session): session closed for user root
Oct 07 21:28:06 compute-0 sudo[70703]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrfmrulvrgijbhbglxeiimdosxadxgxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872485.1821277-542-15039753152355/AnsiballZ_copy.py'
Oct 07 21:28:06 compute-0 sudo[70703]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:28:06 compute-0 python3.9[70705]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759872485.1821277-542-15039753152355/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:28:06 compute-0 sudo[70703]: pam_unix(sudo:session): session closed for user root
Oct 07 21:28:07 compute-0 sudo[70855]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbcwfzrggkgpgdlpkbrrrdudknxqlaui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872486.7650714-572-126218752021637/AnsiballZ_command.py'
Oct 07 21:28:07 compute-0 sudo[70855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:28:07 compute-0 python3.9[70857]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:28:07 compute-0 sudo[70855]: pam_unix(sudo:session): session closed for user root
Oct 07 21:28:08 compute-0 sudo[71008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvtkrkrfsgbhfdpaccpiwvnycjmwuszv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872487.768488-588-42686937735939/AnsiballZ_command.py'
Oct 07 21:28:08 compute-0 sudo[71008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:28:08 compute-0 python3.9[71010]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:28:08 compute-0 sudo[71008]: pam_unix(sudo:session): session closed for user root
Oct 07 21:28:09 compute-0 sudo[71161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-poubnbtckisixopusruvncujmunbyrem ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759872488.6417663-604-57570510216544/AnsiballZ_edpm_nftables_from_files.py'
Oct 07 21:28:09 compute-0 sudo[71161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:28:09 compute-0 python3[71163]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct 07 21:28:09 compute-0 sudo[71161]: pam_unix(sudo:session): session closed for user root
Oct 07 21:28:10 compute-0 sudo[71313]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chqwhuopwrjzzitffzabpkdbdvdyqreg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872489.6734805-620-99051703692036/AnsiballZ_stat.py'
Oct 07 21:28:10 compute-0 sudo[71313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:28:10 compute-0 python3.9[71315]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:28:10 compute-0 sudo[71313]: pam_unix(sudo:session): session closed for user root
Oct 07 21:28:10 compute-0 sudo[71436]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vnfeovlsvyhpxhqdwoostdasuyfjixwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872489.6734805-620-99051703692036/AnsiballZ_copy.py'
Oct 07 21:28:10 compute-0 sudo[71436]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:28:10 compute-0 python3.9[71438]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759872489.6734805-620-99051703692036/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:28:10 compute-0 sudo[71436]: pam_unix(sudo:session): session closed for user root
Oct 07 21:28:11 compute-0 sudo[71588]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ouoqaaftysqkerhbalipigdiijrhcnan ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872491.0848963-650-166129986111330/AnsiballZ_stat.py'
Oct 07 21:28:11 compute-0 sudo[71588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:28:11 compute-0 python3.9[71590]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:28:11 compute-0 sudo[71588]: pam_unix(sudo:session): session closed for user root
Oct 07 21:28:12 compute-0 sudo[71711]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mlygxpungqnaxtcgfzlqwtlizkselbfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872491.0848963-650-166129986111330/AnsiballZ_copy.py'
Oct 07 21:28:12 compute-0 sudo[71711]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:28:12 compute-0 python3.9[71713]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759872491.0848963-650-166129986111330/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:28:12 compute-0 sudo[71711]: pam_unix(sudo:session): session closed for user root
Oct 07 21:28:13 compute-0 sudo[71863]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gexihmfbvxjkmmypwrqyepxggccjvhfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872492.7268045-680-21534775337871/AnsiballZ_stat.py'
Oct 07 21:28:13 compute-0 sudo[71863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:28:13 compute-0 python3.9[71865]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:28:13 compute-0 sudo[71863]: pam_unix(sudo:session): session closed for user root
Oct 07 21:28:13 compute-0 sudo[71986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpewrpmlhnodnklkiwstirkdghkqaocu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872492.7268045-680-21534775337871/AnsiballZ_copy.py'
Oct 07 21:28:13 compute-0 sudo[71986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:28:13 compute-0 python3.9[71988]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759872492.7268045-680-21534775337871/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:28:14 compute-0 sudo[71986]: pam_unix(sudo:session): session closed for user root
Oct 07 21:28:14 compute-0 sudo[72138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibwmghllmunqungukmcavdkkifbjacyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872494.360891-710-22747618601037/AnsiballZ_stat.py'
Oct 07 21:28:14 compute-0 sudo[72138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:28:14 compute-0 python3.9[72140]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:28:14 compute-0 sudo[72138]: pam_unix(sudo:session): session closed for user root
Oct 07 21:28:15 compute-0 sudo[72261]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kptmerwwvawroiyixnzhcxuytilbypiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872494.360891-710-22747618601037/AnsiballZ_copy.py'
Oct 07 21:28:15 compute-0 sudo[72261]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:28:15 compute-0 python3.9[72263]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759872494.360891-710-22747618601037/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:28:15 compute-0 sudo[72261]: pam_unix(sudo:session): session closed for user root
Oct 07 21:28:16 compute-0 sudo[72414]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ethzgoaymwrmrvawfxnzeaqocwtrjmbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872495.9148717-740-386379188182/AnsiballZ_stat.py'
Oct 07 21:28:16 compute-0 sudo[72414]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:28:16 compute-0 python3.9[72416]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:28:16 compute-0 sudo[72414]: pam_unix(sudo:session): session closed for user root
Oct 07 21:28:16 compute-0 sudo[72537]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xiafhsyedzeccjtoinemkzrzdaxvyijm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872495.9148717-740-386379188182/AnsiballZ_copy.py'
Oct 07 21:28:16 compute-0 sudo[72537]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:28:17 compute-0 python3.9[72539]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759872495.9148717-740-386379188182/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:28:17 compute-0 sudo[72537]: pam_unix(sudo:session): session closed for user root
Oct 07 21:28:17 compute-0 sudo[72690]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlkpczcvraeofzykrhelhjokeyqjrewh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872497.5873456-770-155981921484371/AnsiballZ_file.py'
Oct 07 21:28:17 compute-0 sudo[72690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:28:18 compute-0 python3.9[72692]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:28:18 compute-0 sudo[72690]: pam_unix(sudo:session): session closed for user root
Oct 07 21:28:18 compute-0 sudo[72842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rblgruhdtpoaohygrgdfzgzjxmyongbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872498.3998618-786-113313171806030/AnsiballZ_command.py'
Oct 07 21:28:18 compute-0 sudo[72842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:28:18 compute-0 python3.9[72844]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:28:18 compute-0 sudo[72842]: pam_unix(sudo:session): session closed for user root
Oct 07 21:28:20 compute-0 sudo[73001]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmbsvkitibxthrsyyrqsuafgdwaeqccv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872499.382273-802-249975348221846/AnsiballZ_blockinfile.py'
Oct 07 21:28:20 compute-0 sudo[73001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:28:20 compute-0 python3.9[73003]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:28:20 compute-0 sudo[73001]: pam_unix(sudo:session): session closed for user root
Oct 07 21:28:20 compute-0 sudo[73154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifukslwgzevoesxwmvytuykgzxvksohw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872500.639942-820-137564059631401/AnsiballZ_file.py'
Oct 07 21:28:20 compute-0 sudo[73154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:28:21 compute-0 python3.9[73156]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:28:21 compute-0 sudo[73154]: pam_unix(sudo:session): session closed for user root
Oct 07 21:28:21 compute-0 sudo[73306]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nsnytrsyaeckqdeifagkrgebckofhzzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872501.3873186-820-95548146673348/AnsiballZ_file.py'
Oct 07 21:28:21 compute-0 sudo[73306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:28:21 compute-0 python3.9[73308]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:28:21 compute-0 sudo[73306]: pam_unix(sudo:session): session closed for user root
Oct 07 21:28:22 compute-0 sudo[73458]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehulgowahicyjjfihzkzdstnmbtcboqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872502.2562668-850-200453688448285/AnsiballZ_mount.py'
Oct 07 21:28:22 compute-0 sudo[73458]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:28:22 compute-0 python3.9[73460]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Oct 07 21:28:22 compute-0 sudo[73458]: pam_unix(sudo:session): session closed for user root
Oct 07 21:28:22 compute-0 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 07 21:28:22 compute-0 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 07 21:28:23 compute-0 sudo[73612]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-juxlznmdklythiwnpvmheffxppoydidw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872503.1707897-850-37419765031746/AnsiballZ_mount.py'
Oct 07 21:28:23 compute-0 sudo[73612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:28:23 compute-0 python3.9[73614]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Oct 07 21:28:23 compute-0 sudo[73612]: pam_unix(sudo:session): session closed for user root
Oct 07 21:28:23 compute-0 sshd-session[72363]: Invalid user guest from 116.110.151.5 port 49874
Oct 07 21:28:24 compute-0 sshd-session[72363]: Connection closed by invalid user guest 116.110.151.5 port 49874 [preauth]
Oct 07 21:28:24 compute-0 sshd-session[65782]: Connection closed by 192.168.122.30 port 50902
Oct 07 21:28:24 compute-0 sshd-session[65779]: pam_unix(sshd:session): session closed for user zuul
Oct 07 21:28:24 compute-0 systemd[1]: session-16.scope: Deactivated successfully.
Oct 07 21:28:24 compute-0 systemd[1]: session-16.scope: Consumed 33.316s CPU time.
Oct 07 21:28:24 compute-0 systemd-logind[798]: Session 16 logged out. Waiting for processes to exit.
Oct 07 21:28:24 compute-0 systemd-logind[798]: Removed session 16.
Oct 07 21:28:24 compute-0 chronyd[61017]: Selected source 207.34.48.31 (pool.ntp.org)
Oct 07 21:28:29 compute-0 sshd-session[73640]: Accepted publickey for zuul from 192.168.122.30 port 59368 ssh2: ECDSA SHA256:OH28aSFFDwSUyue/q6XYLqn3ZIRCwAhLEh6x3UJ/Ca0
Oct 07 21:28:29 compute-0 systemd-logind[798]: New session 17 of user zuul.
Oct 07 21:28:29 compute-0 systemd[1]: Started Session 17 of User zuul.
Oct 07 21:28:29 compute-0 sshd-session[73640]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 07 21:28:30 compute-0 sudo[73793]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fonjlrbhdykzewnrtxonyfzgsiihwqqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872509.7104945-17-81165207086681/AnsiballZ_tempfile.py'
Oct 07 21:28:30 compute-0 sudo[73793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:28:30 compute-0 python3.9[73795]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Oct 07 21:28:30 compute-0 sudo[73793]: pam_unix(sudo:session): session closed for user root
Oct 07 21:28:30 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Oct 07 21:28:31 compute-0 sudo[73947]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxicjcoljjwlkrdhecgtpjrszebxfkqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872510.770137-41-85202831515424/AnsiballZ_stat.py'
Oct 07 21:28:31 compute-0 sudo[73947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:28:31 compute-0 python3.9[73949]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 07 21:28:31 compute-0 sudo[73947]: pam_unix(sudo:session): session closed for user root
Oct 07 21:28:32 compute-0 sudo[74099]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdijnycnfscivfdfdclvbnnpktdwjnes ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872511.8721848-61-227570816985496/AnsiballZ_setup.py'
Oct 07 21:28:32 compute-0 sudo[74099]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:28:32 compute-0 python3.9[74101]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 07 21:28:32 compute-0 sudo[74099]: pam_unix(sudo:session): session closed for user root
Oct 07 21:28:33 compute-0 sudo[74251]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cppimheqadzqxcwwkugwyhmqgulnfbhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872513.1113055-78-50908595552446/AnsiballZ_blockinfile.py'
Oct 07 21:28:33 compute-0 sudo[74251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:28:33 compute-0 python3.9[74253]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDL3y7HszyqE6RyPnvaoG7H41lJQy5MYkcAcMEZxKRA4jlXdlnD+g4/LgwSyl5YFksRAFEKrOs65lGkrDwnSwh9CiU8h2vJ5MdhithDwMv+QMMujT0VFJs0lepRJJy8wENSZuVcCg2RoLbfAWOC27tDfXwLnssrCe0ZEyP5CZjd6Pu6tv0xbaLi51sdWTa+hmHcj4peic5mqOO6B9n1sSle9QDnahzudUFBy+7SB6bKkd67ggdTjCH2YrpfifpFyY/GSMsYTBBcarySwTddJy4eLyux8Z5VZhzsEkyQ3HhE5VPWfFVOTnmNCHiY8PRgg882uXAoJ0IZtvlvPXVFmvZMa2/r+Wod5Sc/7RbI1NllpNBOa7LdUCrhJD7NzBWR/jS036lY7tf9phzlQfKU/qtpHVLyV2A9SmIUcdP39l8BdVrQlChn1YPN6kro44BzT7hPm9VXQbTfdH7ELKG2NLpvaf1DfT1pEE5r9HOknmyDtkdVv1pfbr/syH/bN0Ezuq0=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIIvbB62scXgubRkbSqMbdGZt0LghvMLtNtKfbj6Au8Ia
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBH8wNRkNjX4tB4OS2KA5ORrPjxBBv/yW5YGyTH8q+HX6LKE3bgt889cTBPxVJmvOn9tedK2ia6aHIoBiTccdm0=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDVX14dKcurQDRklQblAo/nK4bVEFlR0Ayn7Hgn+Wyvjl7x/iNpfF0IzdOi/RVWZT6m2tAlMiOGKH5AaJIb2CWWObNqFzV0fdTcOj6kOpkDHoyj3Dp3SwZFCqFeHZ50gR1WbfUaBMvDJiExyscpwc6uvYA7X6io1C1jND1MKUdHgEdhsLCDcu8z00iQsxRKSEEyDhoQLa5fmyXucUjt+c4ztEp17CkAS4erx6X7OMhsLLvpeP02LwV8hQagwtmEbvieRShQJCRfap1pw2xmUwnC/ghjDelCRiHkUcpLr4+OkSqw523zRBH15RtCLC3dfEBIzxJL1mrO9h4p0kBh1b1okk4smwcNALwNWAAt1c5LbpPi0mu3DK2GqWQSb9mSXNKKWGA2zWcX15zyE5MXlLfn3Pi/DdlgzhwCx8YqPvIrE0UHZyNNKW91c/HV7PwX3yST0akkRtdbd0e7ZOmWUFe+YaIpVvpOZObfYe2LZILh/wbBRS79BBDIGEkm5T/uBBU=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCsmrx2/lwt3R8qOcnhTbR7+wLlJ8/1lWIfhwpYL6cy
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBNoAo4j9tbQOxNgIpJpvAJVtk+BapSqAwLMEPdkqoBo0Jz3jf8iTMUv8Kb73DFLnBhwjniJxPINFSmz+1VwnpHw=
                                             create=True mode=0644 path=/tmp/ansible.un77z5pd state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:28:33 compute-0 sudo[74251]: pam_unix(sudo:session): session closed for user root
Oct 07 21:28:34 compute-0 sudo[74403]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrfejkusvymoueeyqhrlxtouhuefvwdm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872514.0819025-94-123359062504833/AnsiballZ_command.py'
Oct 07 21:28:34 compute-0 sudo[74403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:28:34 compute-0 python3.9[74405]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.un77z5pd' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:28:34 compute-0 sudo[74403]: pam_unix(sudo:session): session closed for user root
Oct 07 21:28:35 compute-0 sudo[74557]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjpzwcmxmjnqmnzksmpbdzxfdyjltxif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872515.2226255-110-2454031612696/AnsiballZ_file.py'
Oct 07 21:28:35 compute-0 sudo[74557]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:28:35 compute-0 python3.9[74559]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.un77z5pd state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:28:35 compute-0 sudo[74557]: pam_unix(sudo:session): session closed for user root
Oct 07 21:28:36 compute-0 sshd-session[73643]: Connection closed by 192.168.122.30 port 59368
Oct 07 21:28:36 compute-0 sshd-session[73640]: pam_unix(sshd:session): session closed for user zuul
Oct 07 21:28:36 compute-0 systemd[1]: session-17.scope: Deactivated successfully.
Oct 07 21:28:36 compute-0 systemd[1]: session-17.scope: Consumed 3.546s CPU time.
Oct 07 21:28:36 compute-0 systemd-logind[798]: Session 17 logged out. Waiting for processes to exit.
Oct 07 21:28:36 compute-0 systemd-logind[798]: Removed session 17.
Oct 07 21:28:41 compute-0 sshd-session[74584]: Accepted publickey for zuul from 192.168.122.30 port 37702 ssh2: ECDSA SHA256:OH28aSFFDwSUyue/q6XYLqn3ZIRCwAhLEh6x3UJ/Ca0
Oct 07 21:28:41 compute-0 systemd-logind[798]: New session 18 of user zuul.
Oct 07 21:28:41 compute-0 systemd[1]: Started Session 18 of User zuul.
Oct 07 21:28:41 compute-0 sshd-session[74584]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 07 21:28:43 compute-0 python3.9[74737]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 07 21:28:44 compute-0 sudo[74891]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hswggnmyuhokjkglcgelsbspxaxycsns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872523.6955814-44-208568780719471/AnsiballZ_systemd.py'
Oct 07 21:28:44 compute-0 sudo[74891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:28:44 compute-0 python3.9[74893]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Oct 07 21:28:44 compute-0 sudo[74891]: pam_unix(sudo:session): session closed for user root
Oct 07 21:28:45 compute-0 sudo[75045]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmfjneozbwmwidgzhxmfrcwtmhivtsni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872525.0057774-60-201178912866052/AnsiballZ_systemd.py'
Oct 07 21:28:45 compute-0 sudo[75045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:28:45 compute-0 python3.9[75047]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 07 21:28:45 compute-0 sudo[75045]: pam_unix(sudo:session): session closed for user root
Oct 07 21:28:46 compute-0 sudo[75198]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdjcetztwaquyqjqrzwksnzrrhsvjsly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872526.0968425-78-203850033530010/AnsiballZ_command.py'
Oct 07 21:28:46 compute-0 sudo[75198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:28:46 compute-0 python3.9[75200]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:28:46 compute-0 sudo[75198]: pam_unix(sudo:session): session closed for user root
Oct 07 21:28:47 compute-0 sudo[75351]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vnrnyqfdsanfuopfmkrpelpgswswotwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872527.2056866-94-255054159559443/AnsiballZ_stat.py'
Oct 07 21:28:47 compute-0 sudo[75351]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:28:47 compute-0 python3.9[75353]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 07 21:28:47 compute-0 sudo[75351]: pam_unix(sudo:session): session closed for user root
Oct 07 21:28:48 compute-0 sudo[75505]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gojvopasnyqmmayhekpmwmkgovybljyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872528.1603289-110-167431590305845/AnsiballZ_command.py'
Oct 07 21:28:48 compute-0 sudo[75505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:28:48 compute-0 python3.9[75507]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:28:48 compute-0 sudo[75505]: pam_unix(sudo:session): session closed for user root
Oct 07 21:28:49 compute-0 sudo[75660]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bztwefyhsksojglmoyvgbyveslpuycke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872529.0260167-126-169198827167058/AnsiballZ_file.py'
Oct 07 21:28:49 compute-0 sudo[75660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:28:49 compute-0 python3.9[75662]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:28:49 compute-0 sudo[75660]: pam_unix(sudo:session): session closed for user root
Oct 07 21:28:50 compute-0 sshd-session[74587]: Connection closed by 192.168.122.30 port 37702
Oct 07 21:28:50 compute-0 sshd-session[74584]: pam_unix(sshd:session): session closed for user zuul
Oct 07 21:28:50 compute-0 systemd[1]: session-18.scope: Deactivated successfully.
Oct 07 21:28:50 compute-0 systemd[1]: session-18.scope: Consumed 4.695s CPU time.
Oct 07 21:28:50 compute-0 systemd-logind[798]: Session 18 logged out. Waiting for processes to exit.
Oct 07 21:28:50 compute-0 systemd-logind[798]: Removed session 18.
Oct 07 21:28:56 compute-0 sshd-session[75687]: Accepted publickey for zuul from 192.168.122.30 port 58478 ssh2: ECDSA SHA256:OH28aSFFDwSUyue/q6XYLqn3ZIRCwAhLEh6x3UJ/Ca0
Oct 07 21:28:56 compute-0 systemd-logind[798]: New session 19 of user zuul.
Oct 07 21:28:56 compute-0 systemd[1]: Started Session 19 of User zuul.
Oct 07 21:28:56 compute-0 sshd-session[75687]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 07 21:28:57 compute-0 python3.9[75840]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 07 21:28:58 compute-0 sudo[75994]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhjsjispopjathqojqywtlfyazieesbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872537.7456024-48-20463859836533/AnsiballZ_setup.py'
Oct 07 21:28:58 compute-0 sudo[75994]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:28:58 compute-0 python3.9[75996]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 07 21:28:58 compute-0 sudo[75994]: pam_unix(sudo:session): session closed for user root
Oct 07 21:28:58 compute-0 sudo[76078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqhqcjgyvmaqheczubxejnqyovpbcfcb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872537.7456024-48-20463859836533/AnsiballZ_dnf.py'
Oct 07 21:28:58 compute-0 sudo[76078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:28:59 compute-0 python3.9[76080]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct 07 21:29:00 compute-0 sudo[76078]: pam_unix(sudo:session): session closed for user root
Oct 07 21:29:01 compute-0 python3.9[76231]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:29:03 compute-0 python3.9[76382]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct 07 21:29:04 compute-0 python3.9[76532]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 07 21:29:04 compute-0 python3.9[76682]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 07 21:29:05 compute-0 sshd-session[75690]: Connection closed by 192.168.122.30 port 58478
Oct 07 21:29:05 compute-0 sshd-session[75687]: pam_unix(sshd:session): session closed for user zuul
Oct 07 21:29:05 compute-0 systemd-logind[798]: Session 19 logged out. Waiting for processes to exit.
Oct 07 21:29:05 compute-0 systemd[1]: session-19.scope: Deactivated successfully.
Oct 07 21:29:05 compute-0 systemd[1]: session-19.scope: Consumed 5.697s CPU time.
Oct 07 21:29:05 compute-0 systemd-logind[798]: Removed session 19.
Oct 07 21:29:11 compute-0 sshd-session[76707]: Accepted publickey for zuul from 192.168.122.30 port 34942 ssh2: ECDSA SHA256:OH28aSFFDwSUyue/q6XYLqn3ZIRCwAhLEh6x3UJ/Ca0
Oct 07 21:29:11 compute-0 systemd-logind[798]: New session 20 of user zuul.
Oct 07 21:29:11 compute-0 systemd[1]: Started Session 20 of User zuul.
Oct 07 21:29:11 compute-0 sshd-session[76707]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 07 21:29:12 compute-0 python3.9[76860]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 07 21:29:14 compute-0 sudo[77014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eakrkcetwznbshtmodpiifkadgawpmhv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872553.579689-81-9628623304718/AnsiballZ_file.py'
Oct 07 21:29:14 compute-0 sudo[77014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:29:14 compute-0 python3.9[77016]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:29:14 compute-0 sudo[77014]: pam_unix(sudo:session): session closed for user root
Oct 07 21:29:14 compute-0 sudo[77166]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dsqowbtgjdkxixurynugaiqzyemnbexu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872554.375295-81-62468578516764/AnsiballZ_file.py'
Oct 07 21:29:14 compute-0 sudo[77166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:29:14 compute-0 python3.9[77168]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:29:14 compute-0 sudo[77166]: pam_unix(sudo:session): session closed for user root
Oct 07 21:29:15 compute-0 sudo[77318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxewtrxehejacfuxefgkricxqhueiobg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872555.071229-109-267267193954588/AnsiballZ_stat.py'
Oct 07 21:29:15 compute-0 sudo[77318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:29:15 compute-0 python3.9[77320]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:29:15 compute-0 sudo[77318]: pam_unix(sudo:session): session closed for user root
Oct 07 21:29:16 compute-0 sudo[77441]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mlflupxccxgnmjkszznyaiopuwiyrsqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872555.071229-109-267267193954588/AnsiballZ_copy.py'
Oct 07 21:29:16 compute-0 sudo[77441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:29:16 compute-0 python3.9[77443]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759872555.071229-109-267267193954588/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=6a88ae7cd039f177840456de94119e92b1204a8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:29:16 compute-0 sudo[77441]: pam_unix(sudo:session): session closed for user root
Oct 07 21:29:17 compute-0 sudo[77593]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aolnkxufeekflhscisdkpsuowddwigxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872556.6984458-109-22767550078942/AnsiballZ_stat.py'
Oct 07 21:29:17 compute-0 sudo[77593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:29:17 compute-0 python3.9[77595]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:29:17 compute-0 sudo[77593]: pam_unix(sudo:session): session closed for user root
Oct 07 21:29:17 compute-0 sudo[77716]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvuxswzzdtffmpfohvogoxtvjdwflrjs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872556.6984458-109-22767550078942/AnsiballZ_copy.py'
Oct 07 21:29:17 compute-0 sudo[77716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:29:17 compute-0 python3.9[77718]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759872556.6984458-109-22767550078942/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=9bd8dec0f6a9146665036e322c409d007fec0eac backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:29:17 compute-0 sudo[77716]: pam_unix(sudo:session): session closed for user root
Oct 07 21:29:18 compute-0 sudo[77868]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwpzmfmfoqrrpsptebzdeibwfparaznx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872557.8318343-109-134314687237067/AnsiballZ_stat.py'
Oct 07 21:29:18 compute-0 sudo[77868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:29:18 compute-0 python3.9[77870]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:29:18 compute-0 sudo[77868]: pam_unix(sudo:session): session closed for user root
Oct 07 21:29:18 compute-0 sudo[77991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qblrdfkoeoupsrohentbahuufucmymkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872557.8318343-109-134314687237067/AnsiballZ_copy.py'
Oct 07 21:29:18 compute-0 sudo[77991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:29:18 compute-0 python3.9[77993]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759872557.8318343-109-134314687237067/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=d83322bacbecaf143edb12dca332dea55380bc53 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:29:18 compute-0 sudo[77991]: pam_unix(sudo:session): session closed for user root
Oct 07 21:29:19 compute-0 sudo[78143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qezlrccqwboilrlhwbkarupdhbvwixop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872559.1182826-189-140331837970384/AnsiballZ_file.py'
Oct 07 21:29:19 compute-0 sudo[78143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:29:19 compute-0 python3.9[78145]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:29:19 compute-0 sudo[78143]: pam_unix(sudo:session): session closed for user root
Oct 07 21:29:20 compute-0 sudo[78295]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrwjdfjwoczjuqgpgpkyjpqxmdufiiyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872559.893035-189-78486717325734/AnsiballZ_file.py'
Oct 07 21:29:20 compute-0 sudo[78295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:29:20 compute-0 python3.9[78297]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:29:20 compute-0 sudo[78295]: pam_unix(sudo:session): session closed for user root
Oct 07 21:29:21 compute-0 sudo[78447]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uubhqbspxsqycxbecnqoifyywlwmezsp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872560.7140932-220-144065012581754/AnsiballZ_stat.py'
Oct 07 21:29:21 compute-0 sudo[78447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:29:21 compute-0 python3.9[78449]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:29:21 compute-0 sudo[78447]: pam_unix(sudo:session): session closed for user root
Oct 07 21:29:21 compute-0 sudo[78570]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhwerzjlzbtdajmhziqskavfaicuatii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872560.7140932-220-144065012581754/AnsiballZ_copy.py'
Oct 07 21:29:21 compute-0 sudo[78570]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:29:21 compute-0 python3.9[78572]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759872560.7140932-220-144065012581754/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=c1ffe494ee70d98ae7086014d1e04b896638c1f1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:29:21 compute-0 sudo[78570]: pam_unix(sudo:session): session closed for user root
Oct 07 21:29:22 compute-0 sudo[78722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhwiuukfrdrimwgisvafbdlopfzsyveh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872562.105625-220-35619542871642/AnsiballZ_stat.py'
Oct 07 21:29:22 compute-0 sudo[78722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:29:22 compute-0 python3.9[78724]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:29:22 compute-0 sudo[78722]: pam_unix(sudo:session): session closed for user root
Oct 07 21:29:23 compute-0 sudo[78845]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vnxqseryadduxobvhpvztngodqqogqfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872562.105625-220-35619542871642/AnsiballZ_copy.py'
Oct 07 21:29:23 compute-0 sudo[78845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:29:23 compute-0 python3.9[78847]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759872562.105625-220-35619542871642/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=09a827476b776ac1b07a59e309fa0b734a7cc9b1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:29:23 compute-0 sudo[78845]: pam_unix(sudo:session): session closed for user root
Oct 07 21:29:23 compute-0 sudo[78997]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kuirwgnbfcwinraajcasyocdvswksixd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872563.5404668-220-161917681493867/AnsiballZ_stat.py'
Oct 07 21:29:23 compute-0 sudo[78997]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:29:24 compute-0 python3.9[78999]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:29:24 compute-0 sudo[78997]: pam_unix(sudo:session): session closed for user root
Oct 07 21:29:24 compute-0 sudo[79120]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otgeaggborhebwwkgtxwtdqwltfnajwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872563.5404668-220-161917681493867/AnsiballZ_copy.py'
Oct 07 21:29:24 compute-0 sudo[79120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:29:24 compute-0 python3.9[79122]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759872563.5404668-220-161917681493867/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=3ae88a0269e8ea66accb74921cf0294a138baef9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:29:24 compute-0 sudo[79120]: pam_unix(sudo:session): session closed for user root
Oct 07 21:29:25 compute-0 sudo[79272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jtsjcwnejuxxbppxhqiuyymcbuiyebnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872564.8797593-300-132161831270469/AnsiballZ_file.py'
Oct 07 21:29:25 compute-0 sudo[79272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:29:25 compute-0 python3.9[79274]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:29:25 compute-0 sudo[79272]: pam_unix(sudo:session): session closed for user root
Oct 07 21:29:25 compute-0 sudo[79424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ziuwaxvtkwhielmtehhwsozlppfoalve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872565.4282658-300-90477492711378/AnsiballZ_file.py'
Oct 07 21:29:25 compute-0 sudo[79424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:29:25 compute-0 python3.9[79426]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:29:25 compute-0 sudo[79424]: pam_unix(sudo:session): session closed for user root
Oct 07 21:29:26 compute-0 sudo[79576]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifqtwxxrvontktjcxezsvoepjrsmyxpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872566.0493927-328-226399705919988/AnsiballZ_stat.py'
Oct 07 21:29:26 compute-0 sudo[79576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:29:26 compute-0 python3.9[79578]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:29:26 compute-0 sudo[79576]: pam_unix(sudo:session): session closed for user root
Oct 07 21:29:26 compute-0 sudo[79699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-isyadgafyhufcviomwrvvplfggdkyjwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872566.0493927-328-226399705919988/AnsiballZ_copy.py'
Oct 07 21:29:26 compute-0 sudo[79699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:29:27 compute-0 python3.9[79701]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759872566.0493927-328-226399705919988/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=a6562892f80f58ed3fef65ca0814d2e351c48c10 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:29:27 compute-0 sudo[79699]: pam_unix(sudo:session): session closed for user root
Oct 07 21:29:27 compute-0 sudo[79851]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amihnppibsmctkyrtdisqasxzvhybhlc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872567.2020917-328-280367239814437/AnsiballZ_stat.py'
Oct 07 21:29:27 compute-0 sudo[79851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:29:27 compute-0 python3.9[79853]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:29:27 compute-0 sudo[79851]: pam_unix(sudo:session): session closed for user root
Oct 07 21:29:28 compute-0 sudo[79974]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkunzbeovyxpvpgwviqqjqxnnovolgeo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872567.2020917-328-280367239814437/AnsiballZ_copy.py'
Oct 07 21:29:28 compute-0 sudo[79974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:29:28 compute-0 python3.9[79976]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759872567.2020917-328-280367239814437/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=e9ccf526176ad75ba5b283e0050f5597ff38563f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:29:28 compute-0 sudo[79974]: pam_unix(sudo:session): session closed for user root
Oct 07 21:29:28 compute-0 sudo[80126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngwelxxztoyheezcvlsxxuneywvspdcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872568.4581883-328-116519049730616/AnsiballZ_stat.py'
Oct 07 21:29:28 compute-0 sudo[80126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:29:28 compute-0 python3.9[80128]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:29:28 compute-0 sudo[80126]: pam_unix(sudo:session): session closed for user root
Oct 07 21:29:29 compute-0 sudo[80249]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hpipsuknouztlbbrekztvjaltyxzrthk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872568.4581883-328-116519049730616/AnsiballZ_copy.py'
Oct 07 21:29:29 compute-0 sudo[80249]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:29:29 compute-0 python3.9[80251]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759872568.4581883-328-116519049730616/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=1ea2414aed38aef62ba0e455792f49f5206d14a0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:29:29 compute-0 sudo[80249]: pam_unix(sudo:session): session closed for user root
Oct 07 21:29:30 compute-0 sudo[80401]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-inuokvnptiginswhpdrlkcfjiiqmyvai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872569.7377841-404-24227751612176/AnsiballZ_file.py'
Oct 07 21:29:30 compute-0 sudo[80401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:29:30 compute-0 python3.9[80403]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:29:30 compute-0 sudo[80401]: pam_unix(sudo:session): session closed for user root
Oct 07 21:29:30 compute-0 sudo[80555]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcokqtmuyftqvwaxlccjdgzejuvinmrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872570.3791816-404-24297566062023/AnsiballZ_file.py'
Oct 07 21:29:30 compute-0 sudo[80555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:29:30 compute-0 python3.9[80557]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:29:30 compute-0 sudo[80555]: pam_unix(sudo:session): session closed for user root
Oct 07 21:29:31 compute-0 sudo[80707]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdamvwuyawsbcehazqdflerzpwndzbfp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872571.1666424-433-34026736362191/AnsiballZ_stat.py'
Oct 07 21:29:31 compute-0 sudo[80707]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:29:31 compute-0 python3.9[80709]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:29:31 compute-0 sudo[80707]: pam_unix(sudo:session): session closed for user root
Oct 07 21:29:31 compute-0 sudo[80830]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hczwmplbfiootqxxznxbymbjlpaidqyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872571.1666424-433-34026736362191/AnsiballZ_copy.py'
Oct 07 21:29:31 compute-0 sudo[80830]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:29:32 compute-0 python3.9[80832]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759872571.1666424-433-34026736362191/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=440d1d8539090c96c754d216af033f0494689834 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:29:32 compute-0 sudo[80830]: pam_unix(sudo:session): session closed for user root
Oct 07 21:29:32 compute-0 sudo[80982]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzctqbbtielaioyhcafhcwpvskdlguxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872572.3206959-433-180322934702703/AnsiballZ_stat.py'
Oct 07 21:29:32 compute-0 sudo[80982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:29:32 compute-0 python3.9[80984]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:29:32 compute-0 sudo[80982]: pam_unix(sudo:session): session closed for user root
Oct 07 21:29:33 compute-0 sudo[81105]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qextltgqhvnvmlolxiphcrvnvbnmprcm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872572.3206959-433-180322934702703/AnsiballZ_copy.py'
Oct 07 21:29:33 compute-0 sudo[81105]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:29:33 compute-0 sshd-session[80522]: Invalid user admin from 116.110.151.5 port 54400
Oct 07 21:29:33 compute-0 python3.9[81107]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759872572.3206959-433-180322934702703/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=e9ccf526176ad75ba5b283e0050f5597ff38563f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:29:33 compute-0 sudo[81105]: pam_unix(sudo:session): session closed for user root
Oct 07 21:29:33 compute-0 sshd-session[80522]: Connection closed by invalid user admin 116.110.151.5 port 54400 [preauth]
Oct 07 21:29:33 compute-0 sudo[81257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbkezkrjqvfjnuupgakyzzmsejdluxsa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872573.5767672-433-151469610807250/AnsiballZ_stat.py'
Oct 07 21:29:33 compute-0 sudo[81257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:29:34 compute-0 python3.9[81259]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:29:34 compute-0 sudo[81257]: pam_unix(sudo:session): session closed for user root
Oct 07 21:29:34 compute-0 sudo[81380]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucqlzgukoufaeurpftdzosedumytmdwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872573.5767672-433-151469610807250/AnsiballZ_copy.py'
Oct 07 21:29:34 compute-0 sudo[81380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:29:34 compute-0 python3.9[81382]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759872573.5767672-433-151469610807250/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=f547a82779811c010717a2538d055304150d9a8e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:29:34 compute-0 sudo[81380]: pam_unix(sudo:session): session closed for user root
Oct 07 21:29:36 compute-0 sudo[81532]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewqcovovnrszrijhaytfcustkaemklak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872575.6647527-536-255646994106220/AnsiballZ_file.py'
Oct 07 21:29:36 compute-0 sudo[81532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:29:36 compute-0 python3.9[81534]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:29:36 compute-0 sudo[81532]: pam_unix(sudo:session): session closed for user root
Oct 07 21:29:36 compute-0 sudo[81684]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzcjofnnnhyqeammnhrjsfzmmyzutcur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872576.3979614-548-94113551414001/AnsiballZ_stat.py'
Oct 07 21:29:36 compute-0 sudo[81684]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:29:36 compute-0 python3.9[81686]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:29:36 compute-0 sudo[81684]: pam_unix(sudo:session): session closed for user root
Oct 07 21:29:37 compute-0 sudo[81807]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akublkujnuaccfltmffwrpcabbbbdyos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872576.3979614-548-94113551414001/AnsiballZ_copy.py'
Oct 07 21:29:37 compute-0 sudo[81807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:29:37 compute-0 python3.9[81809]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759872576.3979614-548-94113551414001/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8f107cf58c6c519943bb67ea5517de98604df546 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:29:37 compute-0 sudo[81807]: pam_unix(sudo:session): session closed for user root
Oct 07 21:29:38 compute-0 sudo[81959]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifdpmmjblpiprxstjxxnovpaczjhwbld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872577.7753274-576-119962172386059/AnsiballZ_file.py'
Oct 07 21:29:38 compute-0 sudo[81959]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:29:38 compute-0 python3.9[81961]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:29:38 compute-0 sudo[81959]: pam_unix(sudo:session): session closed for user root
Oct 07 21:29:38 compute-0 sudo[82111]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvomqgrsdryaoumeswnrsfjvvzmsdtcw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872578.5951016-591-218438889275311/AnsiballZ_stat.py'
Oct 07 21:29:38 compute-0 sudo[82111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:29:39 compute-0 python3.9[82113]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:29:39 compute-0 sudo[82111]: pam_unix(sudo:session): session closed for user root
Oct 07 21:29:39 compute-0 sudo[82234]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxthupgchxnhzhxtvurzsmwleyaohimf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872578.5951016-591-218438889275311/AnsiballZ_copy.py'
Oct 07 21:29:39 compute-0 sudo[82234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:29:39 compute-0 python3.9[82236]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759872578.5951016-591-218438889275311/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8f107cf58c6c519943bb67ea5517de98604df546 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:29:39 compute-0 sudo[82234]: pam_unix(sudo:session): session closed for user root
Oct 07 21:29:40 compute-0 sudo[82386]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fdoceawnopfgmttjpkxdbrxqdnjhvbjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872580.0346196-630-81369860034183/AnsiballZ_file.py'
Oct 07 21:29:40 compute-0 sudo[82386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:29:40 compute-0 python3.9[82388]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:29:40 compute-0 sudo[82386]: pam_unix(sudo:session): session closed for user root
Oct 07 21:29:41 compute-0 sudo[82538]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzbykabvgofujwbucrymmozaszwsortn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872580.7367647-647-70646624853483/AnsiballZ_stat.py'
Oct 07 21:29:41 compute-0 sudo[82538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:29:41 compute-0 python3.9[82540]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:29:41 compute-0 sudo[82538]: pam_unix(sudo:session): session closed for user root
Oct 07 21:29:41 compute-0 sudo[82661]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfhfuksvzyyfrmkieehcnkudkqucexfe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872580.7367647-647-70646624853483/AnsiballZ_copy.py'
Oct 07 21:29:41 compute-0 sudo[82661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:29:41 compute-0 python3.9[82663]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759872580.7367647-647-70646624853483/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8f107cf58c6c519943bb67ea5517de98604df546 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:29:41 compute-0 sudo[82661]: pam_unix(sudo:session): session closed for user root
Oct 07 21:29:42 compute-0 sudo[82813]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jknmmefxkvypkjjhchhftqnszvroosjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872582.0783577-676-71170078337882/AnsiballZ_file.py'
Oct 07 21:29:42 compute-0 sudo[82813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:29:42 compute-0 python3.9[82815]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:29:42 compute-0 sudo[82813]: pam_unix(sudo:session): session closed for user root
Oct 07 21:29:43 compute-0 sudo[82965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-flyivqttxxvhsvohkguvsbodiiygnwzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872582.834449-690-230557257847978/AnsiballZ_stat.py'
Oct 07 21:29:43 compute-0 sudo[82965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:29:43 compute-0 python3.9[82967]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:29:43 compute-0 sudo[82965]: pam_unix(sudo:session): session closed for user root
Oct 07 21:29:43 compute-0 sudo[83088]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqdadgkzfoaxbfsbyldvclsfxkuuqsoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872582.834449-690-230557257847978/AnsiballZ_copy.py'
Oct 07 21:29:43 compute-0 sudo[83088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:29:44 compute-0 python3.9[83090]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759872582.834449-690-230557257847978/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8f107cf58c6c519943bb67ea5517de98604df546 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:29:44 compute-0 sudo[83088]: pam_unix(sudo:session): session closed for user root
Oct 07 21:29:44 compute-0 sudo[83240]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmciijrcuduyohnbpawdcyocnmdvcgew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872584.396171-719-250481033194882/AnsiballZ_file.py'
Oct 07 21:29:44 compute-0 sudo[83240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:29:44 compute-0 python3.9[83242]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:29:44 compute-0 sudo[83240]: pam_unix(sudo:session): session closed for user root
Oct 07 21:29:45 compute-0 sudo[83392]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zikaodmidxkhsuxltavjplrzytukcfnx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872585.1468635-735-270546507028344/AnsiballZ_stat.py'
Oct 07 21:29:45 compute-0 sudo[83392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:29:45 compute-0 python3.9[83394]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:29:45 compute-0 sudo[83392]: pam_unix(sudo:session): session closed for user root
Oct 07 21:29:46 compute-0 sudo[83515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lewdrhqorifkwsqjwpbfupvhatvfrwfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872585.1468635-735-270546507028344/AnsiballZ_copy.py'
Oct 07 21:29:46 compute-0 sudo[83515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:29:46 compute-0 python3.9[83517]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759872585.1468635-735-270546507028344/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8f107cf58c6c519943bb67ea5517de98604df546 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:29:46 compute-0 sudo[83515]: pam_unix(sudo:session): session closed for user root
Oct 07 21:29:46 compute-0 sudo[83667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrguigjtoblbxrimubpsgimuevopxglr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872586.528926-767-152167696328313/AnsiballZ_file.py'
Oct 07 21:29:46 compute-0 sudo[83667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:29:47 compute-0 python3.9[83669]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:29:47 compute-0 sudo[83667]: pam_unix(sudo:session): session closed for user root
Oct 07 21:29:47 compute-0 sudo[83819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xsbyhlrbovwlgzbnynqbrlfknwyhzcpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872587.1956165-782-252288798419254/AnsiballZ_stat.py'
Oct 07 21:29:47 compute-0 sudo[83819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:29:47 compute-0 python3.9[83821]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:29:47 compute-0 sudo[83819]: pam_unix(sudo:session): session closed for user root
Oct 07 21:29:48 compute-0 sudo[83942]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhcactzmqwvhiciyrlngmszcboftnlpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872587.1956165-782-252288798419254/AnsiballZ_copy.py'
Oct 07 21:29:48 compute-0 sudo[83942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:29:48 compute-0 python3.9[83944]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759872587.1956165-782-252288798419254/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8f107cf58c6c519943bb67ea5517de98604df546 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:29:48 compute-0 sudo[83942]: pam_unix(sudo:session): session closed for user root
Oct 07 21:29:48 compute-0 sudo[84094]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcwubajzqsuhcjetnfgzicwzqnycirce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872588.5481296-808-281197283475398/AnsiballZ_file.py'
Oct 07 21:29:48 compute-0 sudo[84094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:29:49 compute-0 python3.9[84096]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:29:49 compute-0 sudo[84094]: pam_unix(sudo:session): session closed for user root
Oct 07 21:29:49 compute-0 sudo[84248]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbsgellrmramswubpguntlekbkzgtoie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872589.4199429-825-424081297593/AnsiballZ_stat.py'
Oct 07 21:29:49 compute-0 sudo[84248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:29:50 compute-0 python3.9[84250]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:29:50 compute-0 sudo[84248]: pam_unix(sudo:session): session closed for user root
Oct 07 21:29:50 compute-0 sudo[84371]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkdvvbkqonqvszmwjeebitkaojylsybz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872589.4199429-825-424081297593/AnsiballZ_copy.py'
Oct 07 21:29:50 compute-0 sudo[84371]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:29:50 compute-0 python3.9[84373]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759872589.4199429-825-424081297593/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8f107cf58c6c519943bb67ea5517de98604df546 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:29:50 compute-0 sudo[84371]: pam_unix(sudo:session): session closed for user root
Oct 07 21:29:55 compute-0 sshd-session[84097]: Invalid user admin from 116.110.151.5 port 47798
Oct 07 21:29:56 compute-0 sshd-session[84097]: Connection closed by invalid user admin 116.110.151.5 port 47798 [preauth]
Oct 07 21:29:57 compute-0 sshd-session[76710]: Connection closed by 192.168.122.30 port 34942
Oct 07 21:29:57 compute-0 sshd-session[76707]: pam_unix(sshd:session): session closed for user zuul
Oct 07 21:29:57 compute-0 systemd[1]: session-20.scope: Deactivated successfully.
Oct 07 21:29:57 compute-0 systemd[1]: session-20.scope: Consumed 30.578s CPU time.
Oct 07 21:29:57 compute-0 systemd-logind[798]: Session 20 logged out. Waiting for processes to exit.
Oct 07 21:29:57 compute-0 systemd-logind[798]: Removed session 20.
Oct 07 21:30:03 compute-0 PackageKit[31462]: daemon quit
Oct 07 21:30:03 compute-0 systemd[1]: packagekit.service: Deactivated successfully.
Oct 07 21:30:03 compute-0 sshd-session[84400]: Accepted publickey for zuul from 192.168.122.30 port 50602 ssh2: ECDSA SHA256:OH28aSFFDwSUyue/q6XYLqn3ZIRCwAhLEh6x3UJ/Ca0
Oct 07 21:30:03 compute-0 systemd-logind[798]: New session 21 of user zuul.
Oct 07 21:30:03 compute-0 systemd[1]: Started Session 21 of User zuul.
Oct 07 21:30:03 compute-0 sshd-session[84400]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 07 21:30:04 compute-0 sshd-session[84398]: Invalid user test from 116.110.151.5 port 34242
Oct 07 21:30:04 compute-0 python3.9[84553]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 07 21:30:06 compute-0 sudo[84707]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgyzsxdddwrkbyuvmbpjgtehnteczdjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872605.3802202-48-47799486740427/AnsiballZ_file.py'
Oct 07 21:30:06 compute-0 sudo[84707]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:30:06 compute-0 python3.9[84709]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:30:06 compute-0 sudo[84707]: pam_unix(sudo:session): session closed for user root
Oct 07 21:30:06 compute-0 sudo[84859]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-inqfclvgjhrrjxfoehhxskvudamwgcee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872606.3721626-48-132021985431466/AnsiballZ_file.py'
Oct 07 21:30:06 compute-0 sudo[84859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:30:06 compute-0 python3.9[84861]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:30:06 compute-0 sudo[84859]: pam_unix(sudo:session): session closed for user root
Oct 07 21:30:07 compute-0 python3.9[85011]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 07 21:30:08 compute-0 sudo[85161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-opjarpsecdrffdipewtmeqlceqdrfuxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872608.1791358-94-51205318913230/AnsiballZ_seboolean.py'
Oct 07 21:30:08 compute-0 sudo[85161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:30:08 compute-0 python3.9[85163]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Oct 07 21:30:09 compute-0 sshd-session[84398]: Connection closed by invalid user test 116.110.151.5 port 34242 [preauth]
Oct 07 21:30:09 compute-0 sudo[85161]: pam_unix(sudo:session): session closed for user root
Oct 07 21:30:11 compute-0 sudo[85317]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hzmcbyqfzbhvkuaqcwezrhapfjrfqbpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872611.1785514-114-56128652138363/AnsiballZ_setup.py'
Oct 07 21:30:11 compute-0 dbus-broker-launch[777]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Oct 07 21:30:11 compute-0 sudo[85317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:30:11 compute-0 python3.9[85319]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 07 21:30:12 compute-0 sudo[85317]: pam_unix(sudo:session): session closed for user root
Oct 07 21:30:12 compute-0 sudo[85401]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spblophjprdwuertstmdxrflngdoskjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872611.1785514-114-56128652138363/AnsiballZ_dnf.py'
Oct 07 21:30:12 compute-0 sudo[85401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:30:12 compute-0 python3.9[85403]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 07 21:30:14 compute-0 sudo[85401]: pam_unix(sudo:session): session closed for user root
Oct 07 21:30:15 compute-0 sudo[85554]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufpxneknpyzspepipmzldhbyjfqhbmlh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872614.4659686-138-126580301869434/AnsiballZ_systemd.py'
Oct 07 21:30:15 compute-0 sudo[85554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:30:15 compute-0 python3.9[85556]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 07 21:30:15 compute-0 sudo[85554]: pam_unix(sudo:session): session closed for user root
Oct 07 21:30:16 compute-0 sudo[85709]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqytapdadmoziignhtukgobzszbhpugv ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759872615.9308252-154-226544017244756/AnsiballZ_edpm_nftables_snippet.py'
Oct 07 21:30:16 compute-0 sudo[85709]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:30:16 compute-0 python3[85711]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                            rule:
                                              proto: udp
                                              dport: 4789
                                          - rule_name: 119 neutron geneve networks
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              state: ["UNTRACKED"]
                                          - rule_name: 120 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: OUTPUT
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                          - rule_name: 121 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: PREROUTING
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                           dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Oct 07 21:30:16 compute-0 sudo[85709]: pam_unix(sudo:session): session closed for user root
Oct 07 21:30:17 compute-0 sudo[85861]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxasnhbgqeyrzbzzsgzjbaqeejlyalnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872617.115732-172-267694118973162/AnsiballZ_file.py'
Oct 07 21:30:17 compute-0 sudo[85861]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:30:17 compute-0 python3.9[85863]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:30:17 compute-0 sudo[85861]: pam_unix(sudo:session): session closed for user root
Oct 07 21:30:18 compute-0 sudo[86013]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pskpuluemnvxzpfswdgvpkaljzbzptek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872617.9627805-188-241391692126410/AnsiballZ_stat.py'
Oct 07 21:30:18 compute-0 sudo[86013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:30:18 compute-0 python3.9[86015]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:30:18 compute-0 sudo[86013]: pam_unix(sudo:session): session closed for user root
Oct 07 21:30:19 compute-0 sudo[86091]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fuyhdrvuzaiyqjfmiehzjxwxzjltleze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872617.9627805-188-241391692126410/AnsiballZ_file.py'
Oct 07 21:30:19 compute-0 sudo[86091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:30:19 compute-0 python3.9[86093]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:30:19 compute-0 sudo[86091]: pam_unix(sudo:session): session closed for user root
Oct 07 21:30:19 compute-0 sudo[86243]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljqwuwhmkbfeistlghsbhiwqrmkqnisr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872619.4832745-212-122884379616056/AnsiballZ_stat.py'
Oct 07 21:30:19 compute-0 sudo[86243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:30:20 compute-0 python3.9[86245]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:30:20 compute-0 sudo[86243]: pam_unix(sudo:session): session closed for user root
Oct 07 21:30:20 compute-0 sudo[86321]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xyugwjxzewlkxgoxijzxousphdplvqeh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872619.4832745-212-122884379616056/AnsiballZ_file.py'
Oct 07 21:30:20 compute-0 sudo[86321]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:30:20 compute-0 python3.9[86323]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.hcubwc5h recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:30:20 compute-0 sudo[86321]: pam_unix(sudo:session): session closed for user root
Oct 07 21:30:21 compute-0 sudo[86473]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eixkwsxdunercgeaxtgpirqfzoqatifz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872620.893983-236-249521816542996/AnsiballZ_stat.py'
Oct 07 21:30:21 compute-0 sudo[86473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:30:21 compute-0 python3.9[86475]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:30:21 compute-0 sudo[86473]: pam_unix(sudo:session): session closed for user root
Oct 07 21:30:21 compute-0 sudo[86551]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gboapjganimkydzagsxpnvbvehrobeqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872620.893983-236-249521816542996/AnsiballZ_file.py'
Oct 07 21:30:21 compute-0 sudo[86551]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:30:22 compute-0 python3.9[86553]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:30:22 compute-0 sudo[86551]: pam_unix(sudo:session): session closed for user root
Oct 07 21:30:22 compute-0 sudo[86703]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bokfkjlxbysqljymlccmnfkfmzvnuhvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872622.3051262-262-45944815182397/AnsiballZ_command.py'
Oct 07 21:30:22 compute-0 sudo[86703]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:30:23 compute-0 python3.9[86705]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:30:23 compute-0 sudo[86703]: pam_unix(sudo:session): session closed for user root
Oct 07 21:30:23 compute-0 sudo[86856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkbhjoufpfaqwjnkgkjcnfmngumkkhlo ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759872623.3707037-278-203690843276650/AnsiballZ_edpm_nftables_from_files.py'
Oct 07 21:30:23 compute-0 sudo[86856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:30:24 compute-0 python3[86858]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct 07 21:30:24 compute-0 sudo[86856]: pam_unix(sudo:session): session closed for user root
Oct 07 21:30:24 compute-0 sudo[87008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxgfxjqruudowyquscciwsgpxaapmjtw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872624.427734-294-226787645137036/AnsiballZ_stat.py'
Oct 07 21:30:24 compute-0 sudo[87008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:30:25 compute-0 python3.9[87010]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:30:25 compute-0 sudo[87008]: pam_unix(sudo:session): session closed for user root
Oct 07 21:30:25 compute-0 sudo[87133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcgothqtkfhusrherrifvokhprehazcr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872624.427734-294-226787645137036/AnsiballZ_copy.py'
Oct 07 21:30:25 compute-0 sudo[87133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:30:25 compute-0 python3.9[87135]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759872624.427734-294-226787645137036/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:30:25 compute-0 sudo[87133]: pam_unix(sudo:session): session closed for user root
Oct 07 21:30:26 compute-0 sudo[87285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ackllsjqgsjtothsembyjwesomowulys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872626.2133484-324-7661600643548/AnsiballZ_stat.py'
Oct 07 21:30:26 compute-0 sudo[87285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:30:26 compute-0 python3.9[87287]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:30:26 compute-0 sudo[87285]: pam_unix(sudo:session): session closed for user root
Oct 07 21:30:27 compute-0 sudo[87410]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwjuqdadxwretmcvnoymsdjwogpyzvpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872626.2133484-324-7661600643548/AnsiballZ_copy.py'
Oct 07 21:30:27 compute-0 sudo[87410]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:30:27 compute-0 python3.9[87412]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759872626.2133484-324-7661600643548/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:30:27 compute-0 sudo[87410]: pam_unix(sudo:session): session closed for user root
Oct 07 21:30:28 compute-0 sudo[87562]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhohtikyfsaacidwyvcokevmdjeqlqpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872627.856221-354-74983098668583/AnsiballZ_stat.py'
Oct 07 21:30:28 compute-0 sudo[87562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:30:28 compute-0 python3.9[87564]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:30:28 compute-0 sudo[87562]: pam_unix(sudo:session): session closed for user root
Oct 07 21:30:28 compute-0 sudo[87687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yaryvvtawcydfslaybttadbyllqxvuov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872627.856221-354-74983098668583/AnsiballZ_copy.py'
Oct 07 21:30:28 compute-0 sudo[87687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:30:29 compute-0 python3.9[87689]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759872627.856221-354-74983098668583/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:30:29 compute-0 sudo[87687]: pam_unix(sudo:session): session closed for user root
Oct 07 21:30:30 compute-0 sudo[87839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sraiiamlppyrfvbqncyjnbqhwmenevzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872629.6079817-384-135004082163735/AnsiballZ_stat.py'
Oct 07 21:30:30 compute-0 sudo[87839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:30:30 compute-0 python3.9[87841]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:30:30 compute-0 sudo[87839]: pam_unix(sudo:session): session closed for user root
Oct 07 21:30:30 compute-0 sudo[87964]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnkpynahhvkxammvwalpbdqnrgrmpmnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872629.6079817-384-135004082163735/AnsiballZ_copy.py'
Oct 07 21:30:30 compute-0 sudo[87964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:30:30 compute-0 python3.9[87966]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759872629.6079817-384-135004082163735/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:30:30 compute-0 sudo[87964]: pam_unix(sudo:session): session closed for user root
Oct 07 21:30:31 compute-0 sudo[88116]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxrxeknjpwdtbtdetpgsnkafhqoodixf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872631.3175042-414-218586605534198/AnsiballZ_stat.py'
Oct 07 21:30:31 compute-0 sudo[88116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:30:31 compute-0 python3.9[88118]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:30:31 compute-0 sudo[88116]: pam_unix(sudo:session): session closed for user root
Oct 07 21:30:32 compute-0 sudo[88241]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-epylutcbpbtonziersrsxinrccecgnmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872631.3175042-414-218586605534198/AnsiballZ_copy.py'
Oct 07 21:30:32 compute-0 sudo[88241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:30:32 compute-0 python3.9[88243]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759872631.3175042-414-218586605534198/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:30:32 compute-0 sudo[88241]: pam_unix(sudo:session): session closed for user root
Oct 07 21:30:33 compute-0 sudo[88393]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-crsthgwgitzhtnlnpfxvgreodxqvtuqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872633.119843-444-198665412084644/AnsiballZ_file.py'
Oct 07 21:30:33 compute-0 sudo[88393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:30:33 compute-0 python3.9[88395]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:30:33 compute-0 sudo[88393]: pam_unix(sudo:session): session closed for user root
Oct 07 21:30:34 compute-0 sudo[88545]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pomdjgsfsklwiovtdjjwqvbspgliknqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872634.0202394-460-142144545442471/AnsiballZ_command.py'
Oct 07 21:30:34 compute-0 sudo[88545]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:30:34 compute-0 python3.9[88547]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:30:34 compute-0 sudo[88545]: pam_unix(sudo:session): session closed for user root
Oct 07 21:30:35 compute-0 sudo[88700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgryaflrzlerpakxlnkfyrcawsziamrz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872634.9235144-476-231516597614900/AnsiballZ_blockinfile.py'
Oct 07 21:30:35 compute-0 sudo[88700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:30:35 compute-0 python3.9[88702]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:30:35 compute-0 sudo[88700]: pam_unix(sudo:session): session closed for user root
Oct 07 21:30:36 compute-0 sudo[88852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pamlmytjauzigqibncthqbutoahfvacu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872636.0053926-494-166751637917822/AnsiballZ_command.py'
Oct 07 21:30:36 compute-0 sudo[88852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:30:36 compute-0 python3.9[88854]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:30:36 compute-0 sudo[88852]: pam_unix(sudo:session): session closed for user root
Oct 07 21:30:37 compute-0 sudo[89005]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cgkvrqzopguqeybyzhqngzeihwyssdcd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872636.9221313-510-215554713664775/AnsiballZ_stat.py'
Oct 07 21:30:37 compute-0 sudo[89005]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:30:37 compute-0 python3.9[89007]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 07 21:30:37 compute-0 sudo[89005]: pam_unix(sudo:session): session closed for user root
Oct 07 21:30:37 compute-0 sudo[89159]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhdxcwkyhrhgdobrsewrjullzxxptggx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872637.7192194-526-115728126433464/AnsiballZ_command.py'
Oct 07 21:30:37 compute-0 sudo[89159]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:30:38 compute-0 python3.9[89161]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:30:38 compute-0 sudo[89159]: pam_unix(sudo:session): session closed for user root
Oct 07 21:30:38 compute-0 sudo[89314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lpcggfltxjrjewoxlyfewcscywktttbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872638.6604497-542-146537402502801/AnsiballZ_file.py'
Oct 07 21:30:38 compute-0 sudo[89314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:30:39 compute-0 python3.9[89316]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:30:39 compute-0 sudo[89314]: pam_unix(sudo:session): session closed for user root
Oct 07 21:30:40 compute-0 python3.9[89466]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 07 21:30:41 compute-0 sudo[89617]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-riqfhtlohugzpebnjynppylzyklpynmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872641.3359642-622-64028905243080/AnsiballZ_command.py'
Oct 07 21:30:41 compute-0 sudo[89617]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:30:41 compute-0 python3.9[89619]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:2e:0a:d8:76:c8:90" external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:30:41 compute-0 ovs-vsctl[89620]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:2e:0a:d8:76:c8:90 external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Oct 07 21:30:41 compute-0 sudo[89617]: pam_unix(sudo:session): session closed for user root
Oct 07 21:30:42 compute-0 sudo[89770]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwznihqsrfrqhxahsbzkyjpfuzdhpjzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872642.2961826-640-117133090525494/AnsiballZ_command.py'
Oct 07 21:30:42 compute-0 sudo[89770]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:30:42 compute-0 python3.9[89772]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                            ovs-vsctl show | grep -q "Manager"
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:30:42 compute-0 sudo[89770]: pam_unix(sudo:session): session closed for user root
Oct 07 21:30:43 compute-0 sudo[89925]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djkodiewihepubudnzgnbktfgintpwjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872643.216025-656-251203063279085/AnsiballZ_command.py'
Oct 07 21:30:43 compute-0 sudo[89925]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:30:43 compute-0 python3.9[89927]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:30:43 compute-0 ovs-vsctl[89928]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Oct 07 21:30:43 compute-0 sudo[89925]: pam_unix(sudo:session): session closed for user root
Oct 07 21:30:44 compute-0 python3.9[90078]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 07 21:30:45 compute-0 sudo[90230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fdnjjpuwtvsxgfeviyoyfrzaemfsgfkp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872645.0372574-690-114847585550069/AnsiballZ_file.py'
Oct 07 21:30:45 compute-0 sudo[90230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:30:45 compute-0 python3.9[90232]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:30:45 compute-0 sudo[90230]: pam_unix(sudo:session): session closed for user root
Oct 07 21:30:46 compute-0 sudo[90382]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkskinfxsizfmjhdkxptixgxejshqqch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872645.8910217-706-39628289839016/AnsiballZ_stat.py'
Oct 07 21:30:46 compute-0 sudo[90382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:30:46 compute-0 python3.9[90384]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:30:46 compute-0 sudo[90382]: pam_unix(sudo:session): session closed for user root
Oct 07 21:30:46 compute-0 sudo[90460]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wrzbbfhdosrdkzjokfqitfsfppajhexx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872645.8910217-706-39628289839016/AnsiballZ_file.py'
Oct 07 21:30:46 compute-0 sudo[90460]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:30:46 compute-0 python3.9[90462]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:30:46 compute-0 sudo[90460]: pam_unix(sudo:session): session closed for user root
Oct 07 21:30:47 compute-0 sudo[90612]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibqbleucsqyciissoypnftabcpsfzqxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872646.9618385-706-83035603706490/AnsiballZ_stat.py'
Oct 07 21:30:47 compute-0 sudo[90612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:30:47 compute-0 python3.9[90614]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:30:47 compute-0 sudo[90612]: pam_unix(sudo:session): session closed for user root
Oct 07 21:30:47 compute-0 sudo[90690]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dyebnmynqpgsojchklwikrijhlqxuabz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872646.9618385-706-83035603706490/AnsiballZ_file.py'
Oct 07 21:30:47 compute-0 sudo[90690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:30:47 compute-0 python3.9[90692]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:30:47 compute-0 sudo[90690]: pam_unix(sudo:session): session closed for user root
Oct 07 21:30:49 compute-0 sudo[90842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkgheinyimbnhohztqxpmnxxpdggelgm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872648.7607455-752-92966311280222/AnsiballZ_file.py'
Oct 07 21:30:49 compute-0 sudo[90842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:30:49 compute-0 python3.9[90844]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:30:49 compute-0 sudo[90842]: pam_unix(sudo:session): session closed for user root
Oct 07 21:30:49 compute-0 sudo[90994]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkkeamxhpfnethstldbvbqjsmfswnmmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872649.594191-768-241155198762652/AnsiballZ_stat.py'
Oct 07 21:30:49 compute-0 sudo[90994]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:30:50 compute-0 python3.9[90996]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:30:50 compute-0 sudo[90994]: pam_unix(sudo:session): session closed for user root
Oct 07 21:30:50 compute-0 sudo[91072]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzosfphjgbmqivyusmuupgazgbspmcfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872649.594191-768-241155198762652/AnsiballZ_file.py'
Oct 07 21:30:50 compute-0 sudo[91072]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:30:50 compute-0 python3.9[91074]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:30:50 compute-0 sudo[91072]: pam_unix(sudo:session): session closed for user root
Oct 07 21:30:51 compute-0 sudo[91225]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftoxujbkkctpssspicjusafrwvskanhl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872651.192513-792-38722686511200/AnsiballZ_stat.py'
Oct 07 21:30:51 compute-0 sudo[91225]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:30:51 compute-0 python3.9[91227]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:30:51 compute-0 sudo[91225]: pam_unix(sudo:session): session closed for user root
Oct 07 21:30:52 compute-0 sudo[91303]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmbellfxmrrbtrpcytytbrnrpphouzqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872651.192513-792-38722686511200/AnsiballZ_file.py'
Oct 07 21:30:52 compute-0 sudo[91303]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:30:52 compute-0 python3.9[91305]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:30:52 compute-0 sudo[91303]: pam_unix(sudo:session): session closed for user root
Oct 07 21:30:53 compute-0 sudo[91455]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvrjtxpfufnkwipbqfbnpqlubtfgqlec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872652.6883807-816-187648636391465/AnsiballZ_systemd.py'
Oct 07 21:30:53 compute-0 sudo[91455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:30:53 compute-0 python3.9[91457]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 07 21:30:53 compute-0 systemd[1]: Reloading.
Oct 07 21:30:53 compute-0 systemd-sysv-generator[91487]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:30:53 compute-0 systemd-rc-local-generator[91479]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:30:53 compute-0 sudo[91455]: pam_unix(sudo:session): session closed for user root
Oct 07 21:30:54 compute-0 sudo[91644]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llcpgezmkpzliufrqrvgjacovjwwfkvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872654.0062983-832-185075326227081/AnsiballZ_stat.py'
Oct 07 21:30:54 compute-0 sudo[91644]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:30:54 compute-0 python3.9[91646]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:30:54 compute-0 sudo[91644]: pam_unix(sudo:session): session closed for user root
Oct 07 21:30:54 compute-0 sudo[91722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yyopeylwshmohpcmwcpufetopqmvyxhu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872654.0062983-832-185075326227081/AnsiballZ_file.py'
Oct 07 21:30:54 compute-0 sudo[91722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:30:55 compute-0 python3.9[91724]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:30:55 compute-0 sudo[91722]: pam_unix(sudo:session): session closed for user root
Oct 07 21:30:55 compute-0 sudo[91874]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpopqxpqwydegzckvdrddzhdhsollynk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872655.5857828-856-61868952034420/AnsiballZ_stat.py'
Oct 07 21:30:55 compute-0 sudo[91874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:30:56 compute-0 python3.9[91876]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:30:56 compute-0 sudo[91874]: pam_unix(sudo:session): session closed for user root
Oct 07 21:30:56 compute-0 sudo[91952]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smzslshlinzonbjljtnrqumrktdugsdb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872655.5857828-856-61868952034420/AnsiballZ_file.py'
Oct 07 21:30:56 compute-0 sudo[91952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:30:56 compute-0 python3.9[91954]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:30:56 compute-0 sudo[91952]: pam_unix(sudo:session): session closed for user root
Oct 07 21:30:57 compute-0 sudo[92104]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqwypbxqktqohotcpfqyuhihfayyojtf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872657.0703104-880-58424755862331/AnsiballZ_systemd.py'
Oct 07 21:30:57 compute-0 sudo[92104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:30:57 compute-0 python3.9[92106]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 07 21:30:57 compute-0 systemd[1]: Reloading.
Oct 07 21:30:57 compute-0 systemd-rc-local-generator[92135]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:30:57 compute-0 systemd-sysv-generator[92139]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:30:58 compute-0 systemd[1]: Starting Create netns directory...
Oct 07 21:30:58 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 07 21:30:58 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 07 21:30:58 compute-0 systemd[1]: Finished Create netns directory.
Oct 07 21:30:58 compute-0 sudo[92104]: pam_unix(sudo:session): session closed for user root
Oct 07 21:30:58 compute-0 sudo[92300]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nyrcoquktcbshbzxaahoqzykpuyzllvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872658.57418-900-132601412122029/AnsiballZ_file.py'
Oct 07 21:30:58 compute-0 sudo[92300]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:30:59 compute-0 python3.9[92302]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:30:59 compute-0 sudo[92300]: pam_unix(sudo:session): session closed for user root
Oct 07 21:30:59 compute-0 sudo[92452]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmxnjkeltlaylkmqqwlkvvsqrkwbzaer ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872659.4459713-916-124398716837512/AnsiballZ_stat.py'
Oct 07 21:30:59 compute-0 sudo[92452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:30:59 compute-0 python3.9[92454]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:30:59 compute-0 sudo[92452]: pam_unix(sudo:session): session closed for user root
Oct 07 21:31:00 compute-0 sudo[92575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgipweywfublyinbprlwxtfukzjgzwfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872659.4459713-916-124398716837512/AnsiballZ_copy.py'
Oct 07 21:31:00 compute-0 sudo[92575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:31:00 compute-0 python3.9[92577]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759872659.4459713-916-124398716837512/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:31:00 compute-0 sudo[92575]: pam_unix(sudo:session): session closed for user root
Oct 07 21:31:01 compute-0 sudo[92727]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqvieocgwdyiyilcgurwlowylixrgouf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872661.2592273-950-254868512530223/AnsiballZ_file.py'
Oct 07 21:31:01 compute-0 sudo[92727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:31:01 compute-0 python3.9[92729]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:31:01 compute-0 sudo[92727]: pam_unix(sudo:session): session closed for user root
Oct 07 21:31:02 compute-0 sudo[92879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgoktwyohseilchrmaoycxriqmzcduhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872662.235891-966-274554279172536/AnsiballZ_stat.py'
Oct 07 21:31:02 compute-0 sudo[92879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:31:02 compute-0 python3.9[92881]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:31:02 compute-0 sudo[92879]: pam_unix(sudo:session): session closed for user root
Oct 07 21:31:03 compute-0 sudo[93002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnhgmdpaocewthkfsjexrfcvrvrnauta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872662.235891-966-274554279172536/AnsiballZ_copy.py'
Oct 07 21:31:03 compute-0 sudo[93002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:31:03 compute-0 python3.9[93004]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759872662.235891-966-274554279172536/.source.json _original_basename=.1dwyuyqk follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:31:03 compute-0 sudo[93002]: pam_unix(sudo:session): session closed for user root
Oct 07 21:31:04 compute-0 sudo[93154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-puicliynaptzkyxwqddwrshtbuwznwkl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872663.7881763-996-116042559693443/AnsiballZ_file.py'
Oct 07 21:31:04 compute-0 sudo[93154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:31:04 compute-0 python3.9[93156]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:31:04 compute-0 sudo[93154]: pam_unix(sudo:session): session closed for user root
Oct 07 21:31:05 compute-0 sudo[93306]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bthgtxdjnexqyvuvcrvkmgfqdfvnwgmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872664.7259586-1012-99383983438851/AnsiballZ_stat.py'
Oct 07 21:31:05 compute-0 sudo[93306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:31:05 compute-0 sudo[93306]: pam_unix(sudo:session): session closed for user root
Oct 07 21:31:05 compute-0 sudo[93429]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfvjkplnvhkpozyhfsljmirrlvurvdbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872664.7259586-1012-99383983438851/AnsiballZ_copy.py'
Oct 07 21:31:05 compute-0 sudo[93429]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:31:05 compute-0 sudo[93429]: pam_unix(sudo:session): session closed for user root
Oct 07 21:31:07 compute-0 sudo[93581]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttmgbaltnkzqmtcqlqvrdujhazbzvmpc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872666.5726914-1046-238264075222636/AnsiballZ_container_config_data.py'
Oct 07 21:31:07 compute-0 sudo[93581]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:31:07 compute-0 python3.9[93583]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Oct 07 21:31:07 compute-0 sudo[93581]: pam_unix(sudo:session): session closed for user root
Oct 07 21:31:08 compute-0 sudo[93733]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nddxkvycpqibvvcmiczgemefhdnlyarm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872667.7132943-1064-139193270065705/AnsiballZ_container_config_hash.py'
Oct 07 21:31:08 compute-0 sudo[93733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:31:08 compute-0 python3.9[93735]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 07 21:31:08 compute-0 sudo[93733]: pam_unix(sudo:session): session closed for user root
Oct 07 21:31:09 compute-0 sudo[93889]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vyctgrnfbswhuptpgcoleebyokmxfqxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872668.7473128-1082-179851884774925/AnsiballZ_podman_container_info.py'
Oct 07 21:31:09 compute-0 sudo[93889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:31:09 compute-0 python3.9[93891]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct 07 21:31:09 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 07 21:31:09 compute-0 sudo[93889]: pam_unix(sudo:session): session closed for user root
Oct 07 21:31:10 compute-0 sshd-session[93811]: Invalid user manager from 103.115.24.11 port 51698
Oct 07 21:31:10 compute-0 sshd-session[93811]: Received disconnect from 103.115.24.11 port 51698:11: Bye Bye [preauth]
Oct 07 21:31:10 compute-0 sshd-session[93811]: Disconnected from invalid user manager 103.115.24.11 port 51698 [preauth]
Oct 07 21:31:10 compute-0 sudo[94052]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fktyzeklhwmqhmtkahtykzjbxpwjlgga ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759872670.2161512-1108-184210494696758/AnsiballZ_edpm_container_manage.py'
Oct 07 21:31:10 compute-0 sudo[94052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:31:11 compute-0 python3[94054]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct 07 21:31:11 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 07 21:31:11 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 07 21:31:11 compute-0 podman[94091]: 2025-10-07 21:31:11.266131688 +0000 UTC m=+0.064860101 container create 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4)
Oct 07 21:31:11 compute-0 podman[94091]: 2025-10-07 21:31:11.230653715 +0000 UTC m=+0.029382228 image pull 3f0eba8665aff2d8053ef7db64bd77093affef7b4125d116bb9a11adf927b8d7 38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest
Oct 07 21:31:11 compute-0 python3[94054]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z 38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest
Oct 07 21:31:11 compute-0 sudo[94052]: pam_unix(sudo:session): session closed for user root
Oct 07 21:31:12 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 07 21:31:12 compute-0 sudo[94279]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atsnwjsozgpeyebbxkkwnsbiuqnbryde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872671.781835-1124-215405225764782/AnsiballZ_stat.py'
Oct 07 21:31:12 compute-0 sudo[94279]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:31:12 compute-0 python3.9[94281]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 07 21:31:12 compute-0 sudo[94279]: pam_unix(sudo:session): session closed for user root
Oct 07 21:31:13 compute-0 sudo[94433]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxjctsibrifibnwcuxuezhbmllcmzahk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872672.7377427-1142-97687805360010/AnsiballZ_file.py'
Oct 07 21:31:13 compute-0 sudo[94433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:31:13 compute-0 python3.9[94435]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:31:13 compute-0 sudo[94433]: pam_unix(sudo:session): session closed for user root
Oct 07 21:31:13 compute-0 sudo[94509]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oecufaaxqvhhsffyfxpygwnqgqhlhovp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872672.7377427-1142-97687805360010/AnsiballZ_stat.py'
Oct 07 21:31:13 compute-0 sudo[94509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:31:13 compute-0 python3.9[94511]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 07 21:31:13 compute-0 sudo[94509]: pam_unix(sudo:session): session closed for user root
Oct 07 21:31:14 compute-0 sudo[94660]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-geipjtddldsnelsfyfifrohvkaukrymv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872673.77984-1142-43753980094670/AnsiballZ_copy.py'
Oct 07 21:31:14 compute-0 sudo[94660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:31:14 compute-0 python3.9[94662]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759872673.77984-1142-43753980094670/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:31:14 compute-0 sudo[94660]: pam_unix(sudo:session): session closed for user root
Oct 07 21:31:14 compute-0 sudo[94736]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uoybpfwiepjcgouaazvzmypjeotxhtzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872673.77984-1142-43753980094670/AnsiballZ_systemd.py'
Oct 07 21:31:14 compute-0 sudo[94736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:31:15 compute-0 python3.9[94738]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 07 21:31:15 compute-0 systemd[1]: Reloading.
Oct 07 21:31:15 compute-0 systemd-sysv-generator[94765]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:31:15 compute-0 systemd-rc-local-generator[94761]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:31:15 compute-0 sudo[94736]: pam_unix(sudo:session): session closed for user root
Oct 07 21:31:15 compute-0 sudo[94847]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swtczzklwzfjqzntuprzelrubwdtiuhy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872673.77984-1142-43753980094670/AnsiballZ_systemd.py'
Oct 07 21:31:15 compute-0 sudo[94847]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:31:16 compute-0 python3.9[94849]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 07 21:31:16 compute-0 systemd[1]: Reloading.
Oct 07 21:31:16 compute-0 systemd-rc-local-generator[94875]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:31:16 compute-0 systemd-sysv-generator[94879]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:31:16 compute-0 systemd[1]: Starting ovn_controller container...
Oct 07 21:31:16 compute-0 systemd[1]: Created slice Virtual Machine and Container Slice.
Oct 07 21:31:16 compute-0 systemd[1]: Started libcrun container.
Oct 07 21:31:16 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4453b10fa13da0fe94fa05d34b87090af326de4fb79603b1ce7ea5f6acb7ab37/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Oct 07 21:31:16 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec.
Oct 07 21:31:16 compute-0 podman[94889]: 2025-10-07 21:31:16.674467326 +0000 UTC m=+0.156354842 container init 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Oct 07 21:31:16 compute-0 ovn_controller[94904]: + sudo -E kolla_set_configs
Oct 07 21:31:16 compute-0 podman[94889]: 2025-10-07 21:31:16.710369822 +0000 UTC m=+0.192257288 container start 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 07 21:31:16 compute-0 edpm-start-podman-container[94889]: ovn_controller
Oct 07 21:31:16 compute-0 systemd[1]: Created slice User Slice of UID 0.
Oct 07 21:31:16 compute-0 systemd[1]: Starting User Runtime Directory /run/user/0...
Oct 07 21:31:16 compute-0 systemd[1]: Finished User Runtime Directory /run/user/0.
Oct 07 21:31:16 compute-0 systemd[1]: Starting User Manager for UID 0...
Oct 07 21:31:16 compute-0 systemd[94937]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Oct 07 21:31:16 compute-0 edpm-start-podman-container[94888]: Creating additional drop-in dependency for "ovn_controller" (0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec)
Oct 07 21:31:16 compute-0 podman[94910]: 2025-10-07 21:31:16.829559893 +0000 UTC m=+0.099132052 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.build-date=20251007, io.buildah.version=1.41.4, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 07 21:31:16 compute-0 systemd[1]: 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec-3372cce139549ba0.service: Main process exited, code=exited, status=1/FAILURE
Oct 07 21:31:16 compute-0 systemd[1]: 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec-3372cce139549ba0.service: Failed with result 'exit-code'.
Oct 07 21:31:16 compute-0 systemd[1]: Reloading.
Oct 07 21:31:16 compute-0 systemd[94937]: Queued start job for default target Main User Target.
Oct 07 21:31:16 compute-0 systemd[94937]: Created slice User Application Slice.
Oct 07 21:31:16 compute-0 systemd[94937]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Oct 07 21:31:16 compute-0 systemd[94937]: Started Daily Cleanup of User's Temporary Directories.
Oct 07 21:31:16 compute-0 systemd[94937]: Reached target Paths.
Oct 07 21:31:16 compute-0 systemd[94937]: Reached target Timers.
Oct 07 21:31:16 compute-0 systemd[94937]: Starting D-Bus User Message Bus Socket...
Oct 07 21:31:16 compute-0 systemd-sysv-generator[94996]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:31:16 compute-0 systemd[94937]: Starting Create User's Volatile Files and Directories...
Oct 07 21:31:16 compute-0 systemd-rc-local-generator[94992]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:31:16 compute-0 systemd[94937]: Finished Create User's Volatile Files and Directories.
Oct 07 21:31:16 compute-0 systemd[94937]: Listening on D-Bus User Message Bus Socket.
Oct 07 21:31:16 compute-0 systemd[94937]: Reached target Sockets.
Oct 07 21:31:16 compute-0 systemd[94937]: Reached target Basic System.
Oct 07 21:31:16 compute-0 systemd[94937]: Reached target Main User Target.
Oct 07 21:31:16 compute-0 systemd[94937]: Startup finished in 134ms.
Oct 07 21:31:17 compute-0 systemd[1]: Started User Manager for UID 0.
Oct 07 21:31:17 compute-0 systemd[1]: Started ovn_controller container.
Oct 07 21:31:17 compute-0 systemd[1]: Started Session c1 of User root.
Oct 07 21:31:17 compute-0 sudo[94847]: pam_unix(sudo:session): session closed for user root
Oct 07 21:31:17 compute-0 ovn_controller[94904]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 07 21:31:17 compute-0 ovn_controller[94904]: INFO:__main__:Validating config file
Oct 07 21:31:17 compute-0 ovn_controller[94904]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 07 21:31:17 compute-0 ovn_controller[94904]: INFO:__main__:Writing out command to execute
Oct 07 21:31:17 compute-0 systemd[1]: session-c1.scope: Deactivated successfully.
Oct 07 21:31:17 compute-0 ovn_controller[94904]: ++ cat /run_command
Oct 07 21:31:17 compute-0 ovn_controller[94904]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Oct 07 21:31:17 compute-0 ovn_controller[94904]: + ARGS=
Oct 07 21:31:17 compute-0 ovn_controller[94904]: + sudo kolla_copy_cacerts
Oct 07 21:31:17 compute-0 systemd[1]: Started Session c2 of User root.
Oct 07 21:31:17 compute-0 systemd[1]: session-c2.scope: Deactivated successfully.
Oct 07 21:31:17 compute-0 ovn_controller[94904]: + [[ ! -n '' ]]
Oct 07 21:31:17 compute-0 ovn_controller[94904]: + . kolla_extend_start
Oct 07 21:31:17 compute-0 ovn_controller[94904]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Oct 07 21:31:17 compute-0 ovn_controller[94904]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Oct 07 21:31:17 compute-0 ovn_controller[94904]: + umask 0022
Oct 07 21:31:17 compute-0 ovn_controller[94904]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Oct 07 21:31:17 compute-0 ovn_controller[94904]: 2025-10-07T21:31:17Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Oct 07 21:31:17 compute-0 ovn_controller[94904]: 2025-10-07T21:31:17Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Oct 07 21:31:17 compute-0 ovn_controller[94904]: 2025-10-07T21:31:17Z|00003|main|INFO|OVN internal version is : [24.09.4-20.37.0-77.8]
Oct 07 21:31:17 compute-0 ovn_controller[94904]: 2025-10-07T21:31:17Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Oct 07 21:31:17 compute-0 ovn_controller[94904]: 2025-10-07T21:31:17Z|00005|stream_ssl|ERR|ssl:ovsdbserver-sb.openstack.svc:6642: connect: Address family not supported by protocol
Oct 07 21:31:17 compute-0 ovn_controller[94904]: 2025-10-07T21:31:17Z|00006|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Oct 07 21:31:17 compute-0 ovn_controller[94904]: 2025-10-07T21:31:17Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Address family not supported by protocol)
Oct 07 21:31:17 compute-0 ovn_controller[94904]: 2025-10-07T21:31:17Z|00008|main|INFO|OVNSB IDL reconnected, force recompute.
Oct 07 21:31:17 compute-0 ovn_controller[94904]: 2025-10-07T21:31:17Z|00009|ovn_util|INFO|statctrl: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Oct 07 21:31:17 compute-0 ovn_controller[94904]: 2025-10-07T21:31:17Z|00010|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct 07 21:31:17 compute-0 ovn_controller[94904]: 2025-10-07T21:31:17Z|00011|rconn|WARN|unix:/var/run/openvswitch/br-int.mgmt: connection failed (No such file or directory)
Oct 07 21:31:17 compute-0 ovn_controller[94904]: 2025-10-07T21:31:17Z|00012|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: waiting 1 seconds before reconnect
Oct 07 21:31:17 compute-0 ovn_controller[94904]: 2025-10-07T21:31:17Z|00013|ovn_util|INFO|pinctrl: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Oct 07 21:31:17 compute-0 ovn_controller[94904]: 2025-10-07T21:31:17Z|00014|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct 07 21:31:17 compute-0 ovn_controller[94904]: 2025-10-07T21:31:17Z|00015|rconn|WARN|unix:/var/run/openvswitch/br-int.mgmt: connection failed (No such file or directory)
Oct 07 21:31:17 compute-0 ovn_controller[94904]: 2025-10-07T21:31:17Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: waiting 1 seconds before reconnect
Oct 07 21:31:17 compute-0 NetworkManager[51722]: <info>  [1759872677.2904] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Oct 07 21:31:17 compute-0 NetworkManager[51722]: <info>  [1759872677.2913] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 07 21:31:17 compute-0 NetworkManager[51722]: <info>  [1759872677.2923] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Oct 07 21:31:17 compute-0 NetworkManager[51722]: <info>  [1759872677.2927] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Oct 07 21:31:17 compute-0 NetworkManager[51722]: <info>  [1759872677.2931] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Oct 07 21:31:17 compute-0 kernel: br-int: entered promiscuous mode
Oct 07 21:31:17 compute-0 systemd-udevd[95036]: Network interface NamePolicy= disabled on kernel command line.
Oct 07 21:31:18 compute-0 sudo[95164]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irdynldckzwleqxrvagsbptsrzpurfyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872677.7092052-1198-44674587549999/AnsiballZ_command.py'
Oct 07 21:31:18 compute-0 sudo[95164]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:31:18 compute-0 ovn_controller[94904]: 2025-10-07T21:31:18Z|00001|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct 07 21:31:18 compute-0 ovn_controller[94904]: 2025-10-07T21:31:18Z|00001|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct 07 21:31:18 compute-0 ovn_controller[94904]: 2025-10-07T21:31:18Z|00017|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Oct 07 21:31:18 compute-0 ovn_controller[94904]: 2025-10-07T21:31:18Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct 07 21:31:18 compute-0 ovn_controller[94904]: 2025-10-07T21:31:18Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct 07 21:31:18 compute-0 ovn_controller[94904]: 2025-10-07T21:31:18Z|00018|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Oct 07 21:31:18 compute-0 ovn_controller[94904]: 2025-10-07T21:31:18Z|00019|ovn_util|INFO|features: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Oct 07 21:31:18 compute-0 ovn_controller[94904]: 2025-10-07T21:31:18Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct 07 21:31:18 compute-0 ovn_controller[94904]: 2025-10-07T21:31:18Z|00021|features|INFO|OVS Feature: ct_zero_snat, state: supported
Oct 07 21:31:18 compute-0 ovn_controller[94904]: 2025-10-07T21:31:18Z|00022|features|INFO|OVS Feature: ct_flush, state: supported
Oct 07 21:31:18 compute-0 ovn_controller[94904]: 2025-10-07T21:31:18Z|00023|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Oct 07 21:31:18 compute-0 ovn_controller[94904]: 2025-10-07T21:31:18Z|00024|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Oct 07 21:31:18 compute-0 ovn_controller[94904]: 2025-10-07T21:31:18Z|00025|main|INFO|OVS feature set changed, force recompute.
Oct 07 21:31:18 compute-0 ovn_controller[94904]: 2025-10-07T21:31:18Z|00026|ovn_util|INFO|ofctrl: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Oct 07 21:31:18 compute-0 ovn_controller[94904]: 2025-10-07T21:31:18Z|00027|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct 07 21:31:18 compute-0 ovn_controller[94904]: 2025-10-07T21:31:18Z|00028|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct 07 21:31:18 compute-0 ovn_controller[94904]: 2025-10-07T21:31:18Z|00029|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Oct 07 21:31:18 compute-0 ovn_controller[94904]: 2025-10-07T21:31:18Z|00030|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Oct 07 21:31:18 compute-0 ovn_controller[94904]: 2025-10-07T21:31:18Z|00031|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct 07 21:31:18 compute-0 ovn_controller[94904]: 2025-10-07T21:31:18Z|00032|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Oct 07 21:31:18 compute-0 ovn_controller[94904]: 2025-10-07T21:31:18Z|00033|features|INFO|OVS Feature: meter_support, state: supported
Oct 07 21:31:18 compute-0 ovn_controller[94904]: 2025-10-07T21:31:18Z|00034|features|INFO|OVS Feature: group_support, state: supported
Oct 07 21:31:18 compute-0 ovn_controller[94904]: 2025-10-07T21:31:18Z|00035|main|INFO|OVS feature set changed, force recompute.
Oct 07 21:31:18 compute-0 ovn_controller[94904]: 2025-10-07T21:31:18Z|00036|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Oct 07 21:31:18 compute-0 ovn_controller[94904]: 2025-10-07T21:31:18Z|00037|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Oct 07 21:31:18 compute-0 NetworkManager[51722]: <info>  [1759872678.2901] manager: (ovn-88a841-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Oct 07 21:31:18 compute-0 kernel: genev_sys_6081: entered promiscuous mode
Oct 07 21:31:18 compute-0 NetworkManager[51722]: <info>  [1759872678.3219] device (genev_sys_6081): carrier: link connected
Oct 07 21:31:18 compute-0 NetworkManager[51722]: <info>  [1759872678.3226] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/18)
Oct 07 21:31:18 compute-0 systemd-udevd[95038]: Network interface NamePolicy= disabled on kernel command line.
Oct 07 21:31:18 compute-0 python3.9[95166]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:31:18 compute-0 ovs-vsctl[95169]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Oct 07 21:31:18 compute-0 sudo[95164]: pam_unix(sudo:session): session closed for user root
Oct 07 21:31:18 compute-0 NetworkManager[51722]: <info>  [1759872678.6903] manager: (ovn-89a2e2-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Oct 07 21:31:19 compute-0 sudo[95319]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iufwqxwcuolkgbkwiifbntsieqeyvgnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872678.7331526-1214-141193676393785/AnsiballZ_command.py'
Oct 07 21:31:19 compute-0 sudo[95319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:31:19 compute-0 python3.9[95321]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:31:19 compute-0 ovs-vsctl[95323]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Oct 07 21:31:19 compute-0 sudo[95319]: pam_unix(sudo:session): session closed for user root
Oct 07 21:31:20 compute-0 sudo[95474]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iggerhvfvkckldoeqyriyphfgchjlglt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872679.8930116-1242-42756876718009/AnsiballZ_command.py'
Oct 07 21:31:20 compute-0 sudo[95474]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:31:20 compute-0 python3.9[95476]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:31:20 compute-0 ovs-vsctl[95477]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Oct 07 21:31:20 compute-0 sudo[95474]: pam_unix(sudo:session): session closed for user root
Oct 07 21:31:21 compute-0 sshd-session[84403]: Connection closed by 192.168.122.30 port 50602
Oct 07 21:31:21 compute-0 sshd-session[84400]: pam_unix(sshd:session): session closed for user zuul
Oct 07 21:31:21 compute-0 systemd[1]: session-21.scope: Deactivated successfully.
Oct 07 21:31:21 compute-0 systemd[1]: session-21.scope: Consumed 50.162s CPU time.
Oct 07 21:31:21 compute-0 systemd-logind[798]: Session 21 logged out. Waiting for processes to exit.
Oct 07 21:31:21 compute-0 systemd-logind[798]: Removed session 21.
Oct 07 21:31:23 compute-0 sshd-session[93837]: Invalid user admin from 116.110.151.5 port 35902
Oct 07 21:31:24 compute-0 sshd-session[93837]: Connection closed by invalid user admin 116.110.151.5 port 35902 [preauth]
Oct 07 21:31:27 compute-0 sshd-session[95502]: Accepted publickey for zuul from 192.168.122.30 port 37948 ssh2: ECDSA SHA256:OH28aSFFDwSUyue/q6XYLqn3ZIRCwAhLEh6x3UJ/Ca0
Oct 07 21:31:27 compute-0 systemd-logind[798]: New session 23 of user zuul.
Oct 07 21:31:27 compute-0 systemd[1]: Started Session 23 of User zuul.
Oct 07 21:31:27 compute-0 sshd-session[95502]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 07 21:31:27 compute-0 systemd[1]: Stopping User Manager for UID 0...
Oct 07 21:31:27 compute-0 systemd[94937]: Activating special unit Exit the Session...
Oct 07 21:31:27 compute-0 systemd[94937]: Stopped target Main User Target.
Oct 07 21:31:27 compute-0 systemd[94937]: Stopped target Basic System.
Oct 07 21:31:27 compute-0 systemd[94937]: Stopped target Paths.
Oct 07 21:31:27 compute-0 systemd[94937]: Stopped target Sockets.
Oct 07 21:31:27 compute-0 systemd[94937]: Stopped target Timers.
Oct 07 21:31:27 compute-0 systemd[94937]: Stopped Daily Cleanup of User's Temporary Directories.
Oct 07 21:31:27 compute-0 systemd[94937]: Closed D-Bus User Message Bus Socket.
Oct 07 21:31:27 compute-0 systemd[94937]: Stopped Create User's Volatile Files and Directories.
Oct 07 21:31:27 compute-0 systemd[94937]: Removed slice User Application Slice.
Oct 07 21:31:27 compute-0 systemd[94937]: Reached target Shutdown.
Oct 07 21:31:27 compute-0 systemd[94937]: Finished Exit the Session.
Oct 07 21:31:27 compute-0 systemd[94937]: Reached target Exit the Session.
Oct 07 21:31:27 compute-0 systemd[1]: user@0.service: Deactivated successfully.
Oct 07 21:31:27 compute-0 systemd[1]: Stopped User Manager for UID 0.
Oct 07 21:31:27 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/0...
Oct 07 21:31:27 compute-0 systemd[1]: run-user-0.mount: Deactivated successfully.
Oct 07 21:31:27 compute-0 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Oct 07 21:31:27 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/0.
Oct 07 21:31:27 compute-0 systemd[1]: Removed slice User Slice of UID 0.
Oct 07 21:31:28 compute-0 python3.9[95657]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 07 21:31:29 compute-0 sudo[95811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-feyqgngwgfprllskdacrnynglbedvhwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872688.963648-48-27809776762399/AnsiballZ_file.py'
Oct 07 21:31:29 compute-0 sudo[95811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:31:29 compute-0 python3.9[95813]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:31:29 compute-0 sudo[95811]: pam_unix(sudo:session): session closed for user root
Oct 07 21:31:30 compute-0 sudo[95963]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-raopfxegjittivfzsovwszekmdljyebf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872689.889919-48-81352418954963/AnsiballZ_file.py'
Oct 07 21:31:30 compute-0 sudo[95963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:31:30 compute-0 python3.9[95965]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:31:30 compute-0 sudo[95963]: pam_unix(sudo:session): session closed for user root
Oct 07 21:31:31 compute-0 sudo[96115]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-buwfuytrdyoexskoqqzbpelysujetoae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872690.7876563-48-99179522196217/AnsiballZ_file.py'
Oct 07 21:31:31 compute-0 sudo[96115]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:31:31 compute-0 python3.9[96117]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:31:31 compute-0 sudo[96115]: pam_unix(sudo:session): session closed for user root
Oct 07 21:31:31 compute-0 sudo[96267]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plvcoptubmhwlljkaexyifowluovqwmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872691.6008806-48-177188033223666/AnsiballZ_file.py'
Oct 07 21:31:31 compute-0 sudo[96267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:31:32 compute-0 python3.9[96269]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:31:32 compute-0 sudo[96267]: pam_unix(sudo:session): session closed for user root
Oct 07 21:31:32 compute-0 sudo[96419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbyoqdopqlmuyouyqgbzvhpgywqfsujr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872692.3541224-48-22483033585759/AnsiballZ_file.py'
Oct 07 21:31:32 compute-0 sudo[96419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:31:32 compute-0 python3.9[96421]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:31:32 compute-0 sudo[96419]: pam_unix(sudo:session): session closed for user root
Oct 07 21:31:33 compute-0 python3.9[96573]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 07 21:31:34 compute-0 sudo[96723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmduudzqptjajkiounscqsdtjjznckyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872694.1875842-136-88580726511693/AnsiballZ_seboolean.py'
Oct 07 21:31:34 compute-0 sudo[96723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:31:35 compute-0 python3.9[96725]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Oct 07 21:31:35 compute-0 sshd-session[96551]: Invalid user admin from 116.110.151.5 port 50666
Oct 07 21:31:35 compute-0 sshd-session[96551]: Connection closed by invalid user admin 116.110.151.5 port 50666 [preauth]
Oct 07 21:31:35 compute-0 sudo[96723]: pam_unix(sudo:session): session closed for user root
Oct 07 21:31:36 compute-0 python3.9[96876]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:31:37 compute-0 python3.9[96997]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759872695.861138-152-259194361137795/.source follow=False _original_basename=haproxy.j2 checksum=5ff51f1b3524cccfb330e96118c6c6fa61bbd2c9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:31:38 compute-0 python3.9[97147]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:31:38 compute-0 python3.9[97268]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759872697.6105049-182-184675273196706/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:31:39 compute-0 sudo[97418]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwfluyteqlfmroszptwrpvlskmesrzbt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872699.2612023-216-218049644891533/AnsiballZ_setup.py'
Oct 07 21:31:39 compute-0 sudo[97418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:31:40 compute-0 python3.9[97420]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 07 21:31:40 compute-0 sudo[97418]: pam_unix(sudo:session): session closed for user root
Oct 07 21:31:40 compute-0 sudo[97502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekylvceulmvgekwmwkeufrduaisgwpdo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872699.2612023-216-218049644891533/AnsiballZ_dnf.py'
Oct 07 21:31:40 compute-0 sudo[97502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:31:41 compute-0 python3.9[97504]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 07 21:31:42 compute-0 sudo[97502]: pam_unix(sudo:session): session closed for user root
Oct 07 21:31:43 compute-0 sudo[97655]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjnhahvnbjjoszsbrehauynijvxztzaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872702.5288804-240-47440835204741/AnsiballZ_systemd.py'
Oct 07 21:31:43 compute-0 sudo[97655]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:31:43 compute-0 python3.9[97657]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 07 21:31:43 compute-0 sudo[97655]: pam_unix(sudo:session): session closed for user root
Oct 07 21:31:44 compute-0 python3.9[97810]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:31:44 compute-0 python3.9[97931]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759872703.8876495-256-227936185855228/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:31:45 compute-0 python3.9[98081]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:31:46 compute-0 python3.9[98202]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759872705.1497355-256-195950513384874/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:31:47 compute-0 ovn_controller[94904]: 2025-10-07T21:31:47Z|00038|memory|INFO|17104 kB peak resident set size after 30.6 seconds
Oct 07 21:31:47 compute-0 ovn_controller[94904]: 2025-10-07T21:31:47Z|00039|memory|INFO|idl-cells-OVN_Southbound:256 idl-cells-Open_vSwitch:528 ofctrl_desired_flow_usage-KB:6 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:2
Oct 07 21:31:47 compute-0 podman[98279]: 2025-10-07 21:31:47.866243753 +0000 UTC m=+0.103945471 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Oct 07 21:31:48 compute-0 python3.9[98379]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:31:48 compute-0 python3.9[98500]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759872707.64359-344-271245738957631/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:31:49 compute-0 python3.9[98650]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:31:50 compute-0 python3.9[98771]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759872708.988589-344-61335835868323/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:31:51 compute-0 python3.9[98921]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 07 21:31:52 compute-0 sudo[99073]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zybfvqmnehhiazgaaohyckvtpajwdfps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872711.6991746-420-204641006377367/AnsiballZ_file.py'
Oct 07 21:31:52 compute-0 sudo[99073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:31:52 compute-0 python3.9[99075]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:31:52 compute-0 sudo[99073]: pam_unix(sudo:session): session closed for user root
Oct 07 21:31:52 compute-0 sudo[99225]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmaxqvhxyqxczjyzhizuoiieesoxbfcd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872712.6147783-436-216560657275422/AnsiballZ_stat.py'
Oct 07 21:31:52 compute-0 sudo[99225]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:31:53 compute-0 python3.9[99227]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:31:53 compute-0 sudo[99225]: pam_unix(sudo:session): session closed for user root
Oct 07 21:31:53 compute-0 sudo[99303]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yasaahawvyxjcgtetygrkjzuepfjatfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872712.6147783-436-216560657275422/AnsiballZ_file.py'
Oct 07 21:31:53 compute-0 sudo[99303]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:31:53 compute-0 python3.9[99305]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:31:53 compute-0 sudo[99303]: pam_unix(sudo:session): session closed for user root
Oct 07 21:31:54 compute-0 sudo[99455]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uoasetvutvasmnumisduegfrbljgguda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872713.8699512-436-71719700753768/AnsiballZ_stat.py'
Oct 07 21:31:54 compute-0 sudo[99455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:31:54 compute-0 python3.9[99457]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:31:54 compute-0 sudo[99455]: pam_unix(sudo:session): session closed for user root
Oct 07 21:31:54 compute-0 sudo[99533]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awgavoqemjeuktsdimdavhggomdzomia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872713.8699512-436-71719700753768/AnsiballZ_file.py'
Oct 07 21:31:54 compute-0 sudo[99533]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:31:54 compute-0 python3.9[99535]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:31:54 compute-0 sudo[99533]: pam_unix(sudo:session): session closed for user root
Oct 07 21:31:55 compute-0 sudo[99685]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vgdmlicaoqovvuelwvnzstnrkpusgmlg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872715.4321353-482-45980951173175/AnsiballZ_file.py'
Oct 07 21:31:55 compute-0 sudo[99685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:31:56 compute-0 python3.9[99687]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:31:56 compute-0 sudo[99685]: pam_unix(sudo:session): session closed for user root
Oct 07 21:31:56 compute-0 sudo[99837]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eddxegmxtdvreffdjjcbebuhwtattzkl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872716.2489855-498-204896924917065/AnsiballZ_stat.py'
Oct 07 21:31:56 compute-0 sudo[99837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:31:56 compute-0 python3.9[99839]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:31:56 compute-0 sudo[99837]: pam_unix(sudo:session): session closed for user root
Oct 07 21:31:57 compute-0 sudo[99915]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drnsxrrakdylqsltifirqunnxtbqxmgr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872716.2489855-498-204896924917065/AnsiballZ_file.py'
Oct 07 21:31:57 compute-0 sudo[99915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:31:57 compute-0 python3.9[99917]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:31:57 compute-0 sudo[99915]: pam_unix(sudo:session): session closed for user root
Oct 07 21:31:58 compute-0 sudo[100067]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yithglovkqpavtqjvydmtqcxyosdxctr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872717.7331786-522-8869503111980/AnsiballZ_stat.py'
Oct 07 21:31:58 compute-0 sudo[100067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:31:58 compute-0 python3.9[100069]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:31:58 compute-0 sudo[100067]: pam_unix(sudo:session): session closed for user root
Oct 07 21:31:58 compute-0 sudo[100145]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-euyreafhxvcqtpxqaukqczcliranojjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872717.7331786-522-8869503111980/AnsiballZ_file.py'
Oct 07 21:31:58 compute-0 sudo[100145]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:31:58 compute-0 python3.9[100147]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:31:58 compute-0 sudo[100145]: pam_unix(sudo:session): session closed for user root
Oct 07 21:31:59 compute-0 sudo[100297]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-igivttycvwmibnpndafhgqtnfchaidqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872719.132184-546-12380190410137/AnsiballZ_systemd.py'
Oct 07 21:31:59 compute-0 sudo[100297]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:31:59 compute-0 python3.9[100299]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 07 21:31:59 compute-0 systemd[1]: Reloading.
Oct 07 21:32:00 compute-0 systemd-rc-local-generator[100321]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:32:00 compute-0 systemd-sysv-generator[100329]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:32:00 compute-0 sudo[100297]: pam_unix(sudo:session): session closed for user root
Oct 07 21:32:00 compute-0 sudo[100486]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-srfytycjkkiaipszquugrhvsiizncdya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872720.4826858-562-248037061045340/AnsiballZ_stat.py'
Oct 07 21:32:00 compute-0 sudo[100486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:32:01 compute-0 python3.9[100488]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:32:01 compute-0 sudo[100486]: pam_unix(sudo:session): session closed for user root
Oct 07 21:32:01 compute-0 sudo[100564]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhrulgzsswdtmjgaozyadfviplztdmub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872720.4826858-562-248037061045340/AnsiballZ_file.py'
Oct 07 21:32:01 compute-0 sudo[100564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:32:01 compute-0 python3.9[100566]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:32:01 compute-0 sudo[100564]: pam_unix(sudo:session): session closed for user root
Oct 07 21:32:02 compute-0 sudo[100716]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rfgmvmmzhmpymrobnladourqfalnothn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872721.8768396-586-168237997269469/AnsiballZ_stat.py'
Oct 07 21:32:02 compute-0 sudo[100716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:32:02 compute-0 python3.9[100718]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:32:02 compute-0 sudo[100716]: pam_unix(sudo:session): session closed for user root
Oct 07 21:32:02 compute-0 sudo[100794]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lddeihxstbrzxlxhpgilddnblhukihqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872721.8768396-586-168237997269469/AnsiballZ_file.py'
Oct 07 21:32:02 compute-0 sudo[100794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:32:02 compute-0 python3.9[100796]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:32:02 compute-0 sudo[100794]: pam_unix(sudo:session): session closed for user root
Oct 07 21:32:03 compute-0 sudo[100946]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qnogxchwgoecicawmeuqhhntmlhsvvmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872723.3397267-610-142671468145824/AnsiballZ_systemd.py'
Oct 07 21:32:03 compute-0 sudo[100946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:32:03 compute-0 python3.9[100948]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 07 21:32:03 compute-0 systemd[1]: Reloading.
Oct 07 21:32:04 compute-0 systemd-rc-local-generator[100976]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:32:04 compute-0 systemd-sysv-generator[100979]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:32:05 compute-0 systemd[1]: Starting Create netns directory...
Oct 07 21:32:05 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 07 21:32:05 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 07 21:32:05 compute-0 systemd[1]: Finished Create netns directory.
Oct 07 21:32:05 compute-0 sudo[100946]: pam_unix(sudo:session): session closed for user root
Oct 07 21:32:05 compute-0 sudo[101139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vehflhtjiamzbmuixjwxqbicqzdjmzxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872725.6036468-630-188194388090719/AnsiballZ_file.py'
Oct 07 21:32:05 compute-0 sudo[101139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:32:06 compute-0 python3.9[101141]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:32:06 compute-0 sudo[101139]: pam_unix(sudo:session): session closed for user root
Oct 07 21:32:06 compute-0 sudo[101291]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmlpcvztrlsgdgepqnqkyqtupwfsiwsh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872726.339945-646-94189641036375/AnsiballZ_stat.py'
Oct 07 21:32:06 compute-0 sudo[101291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:32:06 compute-0 python3.9[101293]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:32:06 compute-0 sudo[101291]: pam_unix(sudo:session): session closed for user root
Oct 07 21:32:07 compute-0 sudo[101414]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adpvshyrkbhmaxpmnkdosgyxgosapwgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872726.339945-646-94189641036375/AnsiballZ_copy.py'
Oct 07 21:32:07 compute-0 sudo[101414]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:32:07 compute-0 python3.9[101416]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759872726.339945-646-94189641036375/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:32:07 compute-0 sudo[101414]: pam_unix(sudo:session): session closed for user root
Oct 07 21:32:08 compute-0 sudo[101566]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukbuatrcrrugbgobunjkixfbrchckyry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872728.1267948-680-220276962874900/AnsiballZ_file.py'
Oct 07 21:32:08 compute-0 sudo[101566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:32:08 compute-0 python3.9[101568]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:32:08 compute-0 sudo[101566]: pam_unix(sudo:session): session closed for user root
Oct 07 21:32:09 compute-0 sudo[101718]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxxvmezdkjwykdvfroljxauczbsrautu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872729.0882096-696-2853892610044/AnsiballZ_stat.py'
Oct 07 21:32:09 compute-0 sudo[101718]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:32:09 compute-0 python3.9[101720]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:32:09 compute-0 sudo[101718]: pam_unix(sudo:session): session closed for user root
Oct 07 21:32:10 compute-0 sudo[101841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvjhpmnxfmpvsaxrwesjcwhdpfnizcrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872729.0882096-696-2853892610044/AnsiballZ_copy.py'
Oct 07 21:32:10 compute-0 sudo[101841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:32:10 compute-0 python3.9[101843]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759872729.0882096-696-2853892610044/.source.json _original_basename=.dzs2dl7b follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:32:10 compute-0 sudo[101841]: pam_unix(sudo:session): session closed for user root
Oct 07 21:32:11 compute-0 sudo[101994]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vyzilgtwcenefobnfshkgnxeiwgjrfds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872730.6562805-726-6468684958011/AnsiballZ_file.py'
Oct 07 21:32:11 compute-0 sudo[101994]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:32:11 compute-0 python3.9[101996]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:32:11 compute-0 sudo[101994]: pam_unix(sudo:session): session closed for user root
Oct 07 21:32:11 compute-0 sudo[102146]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqdihnfhmirvubignijvrxktvupursgq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872731.5939062-742-165929350800283/AnsiballZ_stat.py'
Oct 07 21:32:11 compute-0 sudo[102146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:32:12 compute-0 sudo[102146]: pam_unix(sudo:session): session closed for user root
Oct 07 21:32:12 compute-0 sudo[102269]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxxmqydezinbzdhhsfpclxapdlzlievb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872731.5939062-742-165929350800283/AnsiballZ_copy.py'
Oct 07 21:32:12 compute-0 sudo[102269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:32:12 compute-0 sudo[102269]: pam_unix(sudo:session): session closed for user root
Oct 07 21:32:13 compute-0 sudo[102421]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnjhynupikrasaiqfsbygjbeoohjnmfe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872733.2855024-776-91270853204750/AnsiballZ_container_config_data.py'
Oct 07 21:32:13 compute-0 sudo[102421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:32:14 compute-0 python3.9[102423]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Oct 07 21:32:14 compute-0 sudo[102421]: pam_unix(sudo:session): session closed for user root
Oct 07 21:32:14 compute-0 sudo[102573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-glqreszuabdporofuigfgfbwfvmxkqew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872734.26278-794-209954850146318/AnsiballZ_container_config_hash.py'
Oct 07 21:32:14 compute-0 sudo[102573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:32:15 compute-0 python3.9[102575]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 07 21:32:15 compute-0 sudo[102573]: pam_unix(sudo:session): session closed for user root
Oct 07 21:32:15 compute-0 sudo[102725]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xioltnmzrymjzywyrvcjypurjlhqlbzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872735.3719542-812-47248617779486/AnsiballZ_podman_container_info.py'
Oct 07 21:32:15 compute-0 sudo[102725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:32:15 compute-0 python3.9[102727]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct 07 21:32:16 compute-0 sudo[102725]: pam_unix(sudo:session): session closed for user root
Oct 07 21:32:17 compute-0 sudo[102903]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qywnmsxpwspymrvooyardojtphnwtxhl ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759872736.8499496-838-188728037528853/AnsiballZ_edpm_container_manage.py'
Oct 07 21:32:17 compute-0 sudo[102903]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:32:17 compute-0 python3[102905]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct 07 21:32:17 compute-0 podman[102945]: 2025-10-07 21:32:17.819740325 +0000 UTC m=+0.058788582 container create c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.build-date=20251007, tcib_managed=true, org.label-schema.schema-version=1.0)
Oct 07 21:32:17 compute-0 podman[102945]: 2025-10-07 21:32:17.789691181 +0000 UTC m=+0.028739438 image pull 24d4277b41bbd1d97b6f360ea068040fe96182680512bacad34d1f578f4798a9 38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Oct 07 21:32:17 compute-0 python3[102905]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z 38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Oct 07 21:32:18 compute-0 sudo[102903]: pam_unix(sudo:session): session closed for user root
Oct 07 21:32:18 compute-0 sudo[103148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frrwxlooghczbpjnxupjdkkgraupmtmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872738.4716015-854-5594279472094/AnsiballZ_stat.py'
Oct 07 21:32:18 compute-0 sudo[103148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:32:18 compute-0 podman[103107]: 2025-10-07 21:32:18.87123966 +0000 UTC m=+0.113214803 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller)
Oct 07 21:32:19 compute-0 python3.9[103156]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 07 21:32:19 compute-0 sudo[103148]: pam_unix(sudo:session): session closed for user root
Oct 07 21:32:19 compute-0 sshd-session[102906]: Invalid user admin from 116.110.151.5 port 50056
Oct 07 21:32:19 compute-0 sudo[103314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-npnttwhayljoyhlfhagmgfkgnqijlyxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872739.372039-872-1304895476602/AnsiballZ_file.py'
Oct 07 21:32:19 compute-0 sudo[103314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:32:19 compute-0 python3.9[103316]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:32:19 compute-0 sudo[103314]: pam_unix(sudo:session): session closed for user root
Oct 07 21:32:20 compute-0 sudo[103390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmzduvljnoobettftpzsuchzapqcwigq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872739.372039-872-1304895476602/AnsiballZ_stat.py'
Oct 07 21:32:20 compute-0 sudo[103390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:32:20 compute-0 python3.9[103392]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 07 21:32:20 compute-0 sudo[103390]: pam_unix(sudo:session): session closed for user root
Oct 07 21:32:20 compute-0 sshd-session[102906]: Connection closed by invalid user admin 116.110.151.5 port 50056 [preauth]
Oct 07 21:32:21 compute-0 sudo[103541]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwhdwxdxynapokypxhurwykqvssrmmho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872740.5043201-872-11608981987852/AnsiballZ_copy.py'
Oct 07 21:32:21 compute-0 sudo[103541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:32:21 compute-0 python3.9[103543]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759872740.5043201-872-11608981987852/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:32:21 compute-0 sudo[103541]: pam_unix(sudo:session): session closed for user root
Oct 07 21:32:21 compute-0 sudo[103617]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwpjtmxbclpyxwbhlorhceirpkgghelw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872740.5043201-872-11608981987852/AnsiballZ_systemd.py'
Oct 07 21:32:21 compute-0 sudo[103617]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:32:22 compute-0 python3.9[103619]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 07 21:32:22 compute-0 systemd[1]: Reloading.
Oct 07 21:32:22 compute-0 systemd-rc-local-generator[103640]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:32:22 compute-0 systemd-sysv-generator[103645]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:32:22 compute-0 sudo[103617]: pam_unix(sudo:session): session closed for user root
Oct 07 21:32:22 compute-0 sudo[103728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrzylsbakvumfjwxgxpyejargxkatpjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872740.5043201-872-11608981987852/AnsiballZ_systemd.py'
Oct 07 21:32:22 compute-0 sudo[103728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:32:23 compute-0 python3.9[103730]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 07 21:32:23 compute-0 systemd[1]: Reloading.
Oct 07 21:32:23 compute-0 systemd-sysv-generator[103762]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:32:23 compute-0 systemd-rc-local-generator[103758]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:32:23 compute-0 systemd[1]: Starting ovn_metadata_agent container...
Oct 07 21:32:23 compute-0 systemd[1]: Started libcrun container.
Oct 07 21:32:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2f6b0b2c15d4401f07615365d6cafe184c4c3a205fb9d6ad6e86ff61960d91c7/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Oct 07 21:32:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2f6b0b2c15d4401f07615365d6cafe184c4c3a205fb9d6ad6e86ff61960d91c7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 07 21:32:23 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675.
Oct 07 21:32:23 compute-0 podman[103771]: 2025-10-07 21:32:23.567411738 +0000 UTC m=+0.163948848 container init c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct 07 21:32:23 compute-0 ovn_metadata_agent[103786]: + sudo -E kolla_set_configs
Oct 07 21:32:23 compute-0 podman[103771]: 2025-10-07 21:32:23.614802252 +0000 UTC m=+0.211339352 container start c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct 07 21:32:23 compute-0 edpm-start-podman-container[103771]: ovn_metadata_agent
Oct 07 21:32:23 compute-0 ovn_metadata_agent[103786]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 07 21:32:23 compute-0 ovn_metadata_agent[103786]: INFO:__main__:Validating config file
Oct 07 21:32:23 compute-0 ovn_metadata_agent[103786]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 07 21:32:23 compute-0 ovn_metadata_agent[103786]: INFO:__main__:Copying service configuration files
Oct 07 21:32:23 compute-0 ovn_metadata_agent[103786]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Oct 07 21:32:23 compute-0 ovn_metadata_agent[103786]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Oct 07 21:32:23 compute-0 ovn_metadata_agent[103786]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Oct 07 21:32:23 compute-0 ovn_metadata_agent[103786]: INFO:__main__:Writing out command to execute
Oct 07 21:32:23 compute-0 ovn_metadata_agent[103786]: INFO:__main__:Setting permission for /var/lib/neutron
Oct 07 21:32:23 compute-0 ovn_metadata_agent[103786]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Oct 07 21:32:23 compute-0 ovn_metadata_agent[103786]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Oct 07 21:32:23 compute-0 ovn_metadata_agent[103786]: INFO:__main__:Setting permission for /var/lib/neutron/external
Oct 07 21:32:23 compute-0 ovn_metadata_agent[103786]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Oct 07 21:32:23 compute-0 ovn_metadata_agent[103786]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Oct 07 21:32:23 compute-0 ovn_metadata_agent[103786]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Oct 07 21:32:23 compute-0 ovn_metadata_agent[103786]: ++ cat /run_command
Oct 07 21:32:23 compute-0 ovn_metadata_agent[103786]: + CMD=neutron-ovn-metadata-agent
Oct 07 21:32:23 compute-0 ovn_metadata_agent[103786]: + ARGS=
Oct 07 21:32:23 compute-0 ovn_metadata_agent[103786]: + sudo kolla_copy_cacerts
Oct 07 21:32:23 compute-0 ovn_metadata_agent[103786]: + [[ ! -n '' ]]
Oct 07 21:32:23 compute-0 ovn_metadata_agent[103786]: + . kolla_extend_start
Oct 07 21:32:23 compute-0 ovn_metadata_agent[103786]: Running command: 'neutron-ovn-metadata-agent'
Oct 07 21:32:23 compute-0 ovn_metadata_agent[103786]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Oct 07 21:32:23 compute-0 ovn_metadata_agent[103786]: + umask 0022
Oct 07 21:32:23 compute-0 ovn_metadata_agent[103786]: + exec neutron-ovn-metadata-agent
Oct 07 21:32:23 compute-0 podman[103793]: 2025-10-07 21:32:23.729327642 +0000 UTC m=+0.093077605 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0)
Oct 07 21:32:23 compute-0 edpm-start-podman-container[103770]: Creating additional drop-in dependency for "ovn_metadata_agent" (c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675)
Oct 07 21:32:23 compute-0 systemd[1]: Reloading.
Oct 07 21:32:23 compute-0 systemd-sysv-generator[103864]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:32:23 compute-0 systemd-rc-local-generator[103860]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:32:24 compute-0 systemd[1]: Started ovn_metadata_agent container.
Oct 07 21:32:24 compute-0 sudo[103728]: pam_unix(sudo:session): session closed for user root
Oct 07 21:32:24 compute-0 sshd-session[95505]: Connection closed by 192.168.122.30 port 37948
Oct 07 21:32:24 compute-0 sshd-session[95502]: pam_unix(sshd:session): session closed for user zuul
Oct 07 21:32:24 compute-0 systemd[1]: session-23.scope: Deactivated successfully.
Oct 07 21:32:24 compute-0 systemd[1]: session-23.scope: Consumed 38.849s CPU time.
Oct 07 21:32:24 compute-0 systemd-logind[798]: Session 23 logged out. Waiting for processes to exit.
Oct 07 21:32:24 compute-0 systemd-logind[798]: Removed session 23.
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.517 103791 INFO neutron.common.config [-] Logging enabled!
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.517 103791 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 26.1.0.dev268
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.517 103791 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.12/site-packages/neutron/common/config.py:124
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.518 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2804
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.518 103791 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2805
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.518 103791 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2806
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.518 103791 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2807
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.519 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2809
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.519 103791 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.519 103791 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.519 103791 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.519 103791 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.519 103791 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.519 103791 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.519 103791 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.519 103791 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.520 103791 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.520 103791 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.520 103791 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.520 103791 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.520 103791 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.520 103791 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.520 103791 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.520 103791 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.521 103791 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.521 103791 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.521 103791 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.521 103791 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.521 103791 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.521 103791 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.521 103791 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.521 103791 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.522 103791 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.522 103791 DEBUG neutron.agent.ovn.metadata_agent [-] enable_signals                 = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.522 103791 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.522 103791 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.522 103791 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.522 103791 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.522 103791 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.522 103791 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.523 103791 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.523 103791 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.523 103791 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.523 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.523 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.523 103791 DEBUG neutron.agent.ovn.metadata_agent [-] log_color                      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.523 103791 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.523 103791 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.523 103791 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.524 103791 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.524 103791 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.524 103791 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.524 103791 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.524 103791 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.524 103791 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.524 103791 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.524 103791 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.524 103791 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.525 103791 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.525 103791 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.525 103791 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.525 103791 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.525 103791 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.525 103791 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.525 103791 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.525 103791 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.525 103791 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.526 103791 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.526 103791 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.526 103791 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.526 103791 DEBUG neutron.agent.ovn.metadata_agent [-] my_ip                          = 38.102.83.103 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.526 103791 DEBUG neutron.agent.ovn.metadata_agent [-] my_ipv6                        = ::1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.526 103791 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.526 103791 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.526 103791 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.526 103791 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.527 103791 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.527 103791 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.527 103791 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.527 103791 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.527 103791 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.527 103791 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.527 103791 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.527 103791 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.528 103791 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.528 103791 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.528 103791 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.528 103791 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.528 103791 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.528 103791 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.528 103791 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.528 103791 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.529 103791 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.529 103791 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.529 103791 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.529 103791 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.529 103791 DEBUG neutron.agent.ovn.metadata_agent [-] shell_completion               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.529 103791 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.529 103791 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.529 103791 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.529 103791 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.530 103791 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.530 103791 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.530 103791 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.530 103791 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.530 103791 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.530 103791 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_qinq                      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.530 103791 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.530 103791 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.531 103791 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.531 103791 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.531 103791 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.531 103791 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.531 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.531 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.531 103791 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.531 103791 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.531 103791 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.532 103791 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.532 103791 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.532 103791 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.532 103791 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.532 103791 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.532 103791 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.532 103791 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_requests        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.532 103791 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.533 103791 DEBUG neutron.agent.ovn.metadata_agent [-] profiler_jaeger.process_tags   = {} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.533 103791 DEBUG neutron.agent.ovn.metadata_agent [-] profiler_jaeger.service_name_prefix = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.533 103791 DEBUG neutron.agent.ovn.metadata_agent [-] profiler_otlp.service_name_prefix = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.533 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.533 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.533 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.533 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.533 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.533 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.534 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.534 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.534 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.534 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.534 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.534 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.534 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.534 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.535 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.535 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_timeout     = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.535 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.535 103791 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.535 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.535 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.535 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.535 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.log_daemon_traceback   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.536 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.536 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.536 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.536 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.536 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.536 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.536 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.536 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.537 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.537 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.537 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.537 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.537 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.537 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.537 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.537 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.538 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.538 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.538 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.538 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.538 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.538 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.538 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.538 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.539 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.539 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.539 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.539 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.539 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.539 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.539 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.539 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.540 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.540 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.540 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.540 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.540 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.540 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.540 103791 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.540 103791 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.540 103791 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.541 103791 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.541 103791 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.541 103791 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.541 103791 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.541 103791 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.541 103791 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.541 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.541 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mappings            = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.542 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.datapath_type              = system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.542 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_flood                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.542 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_flood_reports         = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.542 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_flood_unregistered    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.542 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.542 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.int_peer_patch_port        = patch-tun log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.542 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.integration_bridge         = br-int log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.542 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.local_ip                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.542 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_connect_timeout         = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.543 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_inactivity_probe        = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.543 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_listen_address          = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.543 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_listen_port             = 6633 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.543 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_request_timeout         = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.543 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.openflow_processed_per_port = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.543 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.543 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_debug                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.543 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.544 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.qos_meter_bandwidth        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.544 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_bandwidths = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.544 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_default_hypervisor = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.544 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_hypervisors = {} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.544 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_inventory_defaults = {'allocation_ratio': 1.0, 'min_unit': 1, 'step_size': 1, 'reserved': 0} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.544 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_packet_processing_inventory_defaults = {'allocation_ratio': 1.0, 'min_unit': 1, 'step_size': 1, 'reserved': 0} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.544 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_packet_processing_with_direction = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.544 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_packet_processing_without_direction = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.544 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ssl_ca_cert_file           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.545 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ssl_cert_file              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.545 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ssl_key_file               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.545 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.tun_peer_patch_port        = patch-int log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.545 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.tunnel_bridge              = br-tun log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.545 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.vhostuser_socket_dir       = /var/run/openvswitch log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.545 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.545 103791 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.545 103791 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.546 103791 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.546 103791 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.546 103791 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.546 103791 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.546 103791 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.546 103791 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.546 103791 DEBUG neutron.agent.ovn.metadata_agent [-] agent.extensions               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.546 103791 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.546 103791 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.547 103791 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.547 103791 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.547 103791 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.547 103791 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.547 103791 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.547 103791 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.547 103791 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.547 103791 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.547 103791 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.548 103791 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.548 103791 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.548 103791 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.548 103791 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.548 103791 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.548 103791 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.548 103791 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.548 103791 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.548 103791 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.549 103791 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.549 103791 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.549 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.549 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.549 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.549 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.549 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.549 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.550 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.550 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.550 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.550 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.550 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.550 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.550 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.550 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.551 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.551 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.551 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.551 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.551 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.551 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.551 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.551 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.552 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.552 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.552 103791 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.552 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.broadcast_arps_to_all_routers = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.552 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.552 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.552 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_records_ovn_owned      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.552 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.552 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.553 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.fdb_age_threshold          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.553 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.live_migration_activation_strategy = rarp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.553 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.localnet_learn_fdb         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.553 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.mac_binding_age_threshold  = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.553 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.553 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.553 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.554 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.554 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.554 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.554 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.554 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.554 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = ['tcp:127.0.0.1:6641'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.554 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.554 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_router_indirect_snat   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.554 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.555 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.555 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ['ssl:ovsdbserver-sb.openstack.svc:6642'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.555 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.555 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.555 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.555 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.555 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.555 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.556 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn_nb_global.fdb_removal_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.556 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn_nb_global.ignore_lsp_down  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.556 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn_nb_global.mac_binding_removal_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.556 103791 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.base_query_rate_limit = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.556 103791 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.base_window_duration = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.556 103791 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.burst_query_rate_limit = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.556 103791 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.burst_window_duration = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.556 103791 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.ip_versions = [4] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.556 103791 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.rate_limit_enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.557 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.557 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.557 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.557 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.557 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.557 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.557 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.557 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.558 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.558 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.558 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.558 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.hostname = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.558 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.558 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.558 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.558 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.559 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_splay = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.559 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.processname = neutron-ovn-metadata-agent log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.559 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.559 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.559 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.559 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.559 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.559 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.560 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.560 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.560 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.560 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.560 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_stream_fanout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.560 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.560 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_quorum_queue = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.560 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.560 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.561 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.561 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.561 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.561 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.561 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.561 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.use_queue_manager = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.561 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.562 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.562 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.562 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.562 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_reports.file_event_handler = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.562 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.562 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.562 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2828
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.570 103791 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.570 103791 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.571 103791 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.571 103791 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.571 103791 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.580 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name dca786dc-b408-4181-8e47-0e14c60f13da (UUID: dca786dc-b408-4181-8e47-0e14c60f13da) and ovn bridge br-int. _load_config /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:419
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.606 103791 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.607 103791 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.607 103791 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Port_Binding.logical_port autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.607 103791 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.607 103791 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.609 103791 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.614 103791 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.621 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', 'dca786dc-b408-4181-8e47-0e14c60f13da'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], external_ids={}, name=dca786dc-b408-4181-8e47-0e14c60f13da, nb_cfg_timestamp=1759872686290, nb_cfg=1) old= matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 21:32:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:25.623 103791 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmp11rzevkk/privsep.sock']
Oct 07 21:32:26 compute-0 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Oct 07 21:32:26 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:26.370 103791 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Oct 07 21:32:26 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:26.371 103791 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp11rzevkk/privsep.sock __init__ /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:377
Oct 07 21:32:26 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:26.215 103905 INFO oslo.privsep.daemon [-] privsep daemon starting
Oct 07 21:32:26 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:26.221 103905 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Oct 07 21:32:26 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:26.223 103905 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Oct 07 21:32:26 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:26.223 103905 INFO oslo.privsep.daemon [-] privsep daemon running as pid 103905
Oct 07 21:32:26 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:26.373 103905 DEBUG oslo.privsep.daemon [-] privsep: reply[b0d4e3a8-1a60-4d0b-9aa1-d1303478f047]: (2,) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:32:26 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:26.791 103905 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:32:26 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:26.791 103905 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:32:26 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:26.791 103905 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:32:27 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:27.243 103905 INFO oslo_service.backend [-] Loading backend: eventlet
Oct 07 21:32:27 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:27.249 103905 INFO oslo_service.backend [-] Backend 'eventlet' successfully loaded and cached.
Oct 07 21:32:27 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:27.283 103905 DEBUG oslo.privsep.daemon [-] privsep: reply[d0b2aafd-a273-4104-9bd4-3b4d98b710ee]: (4, []) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:32:27 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:27.284 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=dca786dc-b408-4181-8e47-0e14c60f13da, column=external_ids, values=({'neutron:ovn-metadata-id': '8493deb0-ce8f-5095-898d-1613ed0a5683'},)) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:32:27 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:27.313 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=dca786dc-b408-4181-8e47-0e14c60f13da, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:32:27 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:32:27.321 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=dca786dc-b408-4181-8e47-0e14c60f13da, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '1'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:32:29 compute-0 sshd-session[103910]: Accepted publickey for zuul from 192.168.122.30 port 59580 ssh2: ECDSA SHA256:OH28aSFFDwSUyue/q6XYLqn3ZIRCwAhLEh6x3UJ/Ca0
Oct 07 21:32:29 compute-0 systemd-logind[798]: New session 24 of user zuul.
Oct 07 21:32:29 compute-0 systemd[1]: Started Session 24 of User zuul.
Oct 07 21:32:29 compute-0 sshd-session[103910]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 07 21:32:30 compute-0 python3.9[104063]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 07 21:32:31 compute-0 sudo[104217]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffkbytxcwhgjmktiqwgygwbxrudteckl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872751.3756196-48-136358774511473/AnsiballZ_command.py'
Oct 07 21:32:31 compute-0 sudo[104217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:32:32 compute-0 python3.9[104219]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:32:32 compute-0 sudo[104217]: pam_unix(sudo:session): session closed for user root
Oct 07 21:32:33 compute-0 sudo[104382]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtanasuuxorzbrutwtfofjxbeauxlupe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872752.5964644-70-271747562845935/AnsiballZ_systemd_service.py'
Oct 07 21:32:33 compute-0 sudo[104382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:32:33 compute-0 python3.9[104384]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 07 21:32:33 compute-0 systemd[1]: Reloading.
Oct 07 21:32:33 compute-0 systemd-sysv-generator[104411]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:32:33 compute-0 systemd-rc-local-generator[104408]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:32:33 compute-0 sudo[104382]: pam_unix(sudo:session): session closed for user root
Oct 07 21:32:34 compute-0 python3.9[104569]: ansible-ansible.builtin.service_facts Invoked
Oct 07 21:32:34 compute-0 network[104586]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 07 21:32:34 compute-0 network[104587]: 'network-scripts' will be removed from distribution in near future.
Oct 07 21:32:34 compute-0 network[104588]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 07 21:32:41 compute-0 sudo[104850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-admtphorghyyeyxscaojxsgcpztjpexe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872761.1064763-108-153998833125250/AnsiballZ_systemd_service.py'
Oct 07 21:32:41 compute-0 sudo[104850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:32:41 compute-0 python3.9[104852]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 07 21:32:41 compute-0 sudo[104850]: pam_unix(sudo:session): session closed for user root
Oct 07 21:32:42 compute-0 sudo[105003]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwvwglckbmbelafdexwrpnvqasemavrq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872762.0193017-108-187157217114629/AnsiballZ_systemd_service.py'
Oct 07 21:32:42 compute-0 sudo[105003]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:32:42 compute-0 python3.9[105005]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 07 21:32:42 compute-0 sudo[105003]: pam_unix(sudo:session): session closed for user root
Oct 07 21:32:43 compute-0 sudo[105156]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bglahtrrvlsxpjfwampewvxykizrrqvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872762.9292333-108-64642750444343/AnsiballZ_systemd_service.py'
Oct 07 21:32:43 compute-0 sudo[105156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:32:43 compute-0 python3.9[105158]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 07 21:32:43 compute-0 sudo[105156]: pam_unix(sudo:session): session closed for user root
Oct 07 21:32:44 compute-0 sudo[105309]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-olugxpvptkabidrnpbudrlawofuobbiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872763.817333-108-58262185166124/AnsiballZ_systemd_service.py'
Oct 07 21:32:44 compute-0 sudo[105309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:32:44 compute-0 python3.9[105311]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 07 21:32:44 compute-0 sudo[105309]: pam_unix(sudo:session): session closed for user root
Oct 07 21:32:44 compute-0 sudo[105462]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qsdymogewmrswidlznnjpyohfscdqgou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872764.6185362-108-205470493663756/AnsiballZ_systemd_service.py'
Oct 07 21:32:44 compute-0 sudo[105462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:32:45 compute-0 python3.9[105464]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 07 21:32:45 compute-0 sudo[105462]: pam_unix(sudo:session): session closed for user root
Oct 07 21:32:45 compute-0 sudo[105615]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zriqmwbvhdlvepbzuivjffiepfhdemlb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872765.5417385-108-49546788354528/AnsiballZ_systemd_service.py'
Oct 07 21:32:45 compute-0 sudo[105615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:32:46 compute-0 python3.9[105617]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 07 21:32:46 compute-0 sudo[105615]: pam_unix(sudo:session): session closed for user root
Oct 07 21:32:46 compute-0 sudo[105768]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ommgdculvvxmyorlfmpytczyktyybvsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872766.4595785-108-39238278890316/AnsiballZ_systemd_service.py'
Oct 07 21:32:46 compute-0 sudo[105768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:32:47 compute-0 python3.9[105770]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 07 21:32:47 compute-0 sudo[105768]: pam_unix(sudo:session): session closed for user root
Oct 07 21:32:48 compute-0 sudo[105930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owegwiwxayixnhpqyvhdprowepyouyxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872768.4006982-212-109046203232118/AnsiballZ_file.py'
Oct 07 21:32:48 compute-0 sudo[105930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:32:49 compute-0 podman[105895]: 2025-10-07 21:32:49.093076192 +0000 UTC m=+0.143220512 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251007, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller)
Oct 07 21:32:49 compute-0 python3.9[105932]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:32:49 compute-0 sudo[105930]: pam_unix(sudo:session): session closed for user root
Oct 07 21:32:49 compute-0 sudo[106097]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pofpyvmodyexqfrvnnkpfuprjddaknla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872769.3864782-212-46650059651536/AnsiballZ_file.py'
Oct 07 21:32:49 compute-0 sudo[106097]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:32:49 compute-0 python3.9[106099]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:32:50 compute-0 sudo[106097]: pam_unix(sudo:session): session closed for user root
Oct 07 21:32:50 compute-0 sudo[106249]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmnnakzlctgsnxafuipajadivfvcwvxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872770.1741185-212-113039501485398/AnsiballZ_file.py'
Oct 07 21:32:50 compute-0 sudo[106249]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:32:50 compute-0 python3.9[106251]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:32:50 compute-0 sudo[106249]: pam_unix(sudo:session): session closed for user root
Oct 07 21:32:51 compute-0 sudo[106401]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjhzygyztbypeqifmsjcbydnnpvudrzs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872770.9481292-212-274319997387755/AnsiballZ_file.py'
Oct 07 21:32:51 compute-0 sudo[106401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:32:51 compute-0 python3.9[106403]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:32:51 compute-0 sudo[106401]: pam_unix(sudo:session): session closed for user root
Oct 07 21:32:52 compute-0 sudo[106553]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhcyppqgsxyodgosgzzrreiltnjmcaor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872771.6991293-212-119277294780467/AnsiballZ_file.py'
Oct 07 21:32:52 compute-0 sudo[106553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:32:52 compute-0 python3.9[106555]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:32:52 compute-0 sudo[106553]: pam_unix(sudo:session): session closed for user root
Oct 07 21:32:52 compute-0 sudo[106705]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntlsnzqcqazdtceqppgrkfklpeqtxdtq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872772.4330192-212-27672389163950/AnsiballZ_file.py'
Oct 07 21:32:52 compute-0 sudo[106705]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:32:53 compute-0 python3.9[106707]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:32:53 compute-0 sudo[106705]: pam_unix(sudo:session): session closed for user root
Oct 07 21:32:53 compute-0 sudo[106857]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsquqoocnlxnafxaxrrbbcbwjtxlhaqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872773.2369392-212-220101276817939/AnsiballZ_file.py'
Oct 07 21:32:53 compute-0 sudo[106857]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:32:53 compute-0 python3.9[106859]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:32:53 compute-0 sudo[106857]: pam_unix(sudo:session): session closed for user root
Oct 07 21:32:54 compute-0 sudo[107022]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzaxrdfuzxrhftxnsmtwqobcuqdycsco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872774.3617668-312-87912764731924/AnsiballZ_file.py'
Oct 07 21:32:54 compute-0 sudo[107022]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:32:54 compute-0 podman[106983]: 2025-10-07 21:32:54.810005493 +0000 UTC m=+0.094827850 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4)
Oct 07 21:32:55 compute-0 python3.9[107030]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:32:55 compute-0 sudo[107022]: pam_unix(sudo:session): session closed for user root
Oct 07 21:32:55 compute-0 sudo[107182]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pirhopdvdsxslfkowwvpxwykqrsvaayk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872775.1767426-312-6513067895723/AnsiballZ_file.py'
Oct 07 21:32:55 compute-0 sudo[107182]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:32:55 compute-0 python3.9[107184]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:32:55 compute-0 sudo[107182]: pam_unix(sudo:session): session closed for user root
Oct 07 21:32:56 compute-0 sudo[107334]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emqzzfrkdugtrgvhzhqwletcandfjymn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872775.8841743-312-212111231305030/AnsiballZ_file.py'
Oct 07 21:32:56 compute-0 sudo[107334]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:32:56 compute-0 python3.9[107336]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:32:56 compute-0 sudo[107334]: pam_unix(sudo:session): session closed for user root
Oct 07 21:32:56 compute-0 sudo[107486]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzmcwxpjqpvpwfrmskytxvojdjqaikyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872776.4812994-312-100044388788300/AnsiballZ_file.py'
Oct 07 21:32:56 compute-0 sudo[107486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:32:56 compute-0 python3.9[107488]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:32:56 compute-0 sudo[107486]: pam_unix(sudo:session): session closed for user root
Oct 07 21:32:57 compute-0 sudo[107638]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujpbhhhddadbhbltoqzkyzvdjmgvscvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872777.0843544-312-10400782830663/AnsiballZ_file.py'
Oct 07 21:32:57 compute-0 sudo[107638]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:32:57 compute-0 python3.9[107640]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:32:57 compute-0 sudo[107638]: pam_unix(sudo:session): session closed for user root
Oct 07 21:32:58 compute-0 sudo[107790]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-paljrmtkrtqcawoszcaobrupkyfbofaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872777.837303-312-57633312549146/AnsiballZ_file.py'
Oct 07 21:32:58 compute-0 sudo[107790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:32:58 compute-0 python3.9[107792]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:32:58 compute-0 sudo[107790]: pam_unix(sudo:session): session closed for user root
Oct 07 21:32:58 compute-0 sudo[107942]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-snyvtpbsmzxlubdvjtotzuhqvbmprkcr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872778.455537-312-166053904291478/AnsiballZ_file.py'
Oct 07 21:32:58 compute-0 sudo[107942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:32:58 compute-0 python3.9[107944]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:32:58 compute-0 sudo[107942]: pam_unix(sudo:session): session closed for user root
Oct 07 21:33:00 compute-0 sudo[108094]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikkjhkhsnftxziyvtzcruvaesumroziq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872780.0281887-414-12536779460720/AnsiballZ_command.py'
Oct 07 21:33:00 compute-0 sudo[108094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:33:00 compute-0 python3.9[108096]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:33:00 compute-0 sudo[108094]: pam_unix(sudo:session): session closed for user root
Oct 07 21:33:01 compute-0 python3.9[108248]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct 07 21:33:02 compute-0 sudo[108398]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hanfmzhrpzyfmpsybjnysbdpyimphtfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872781.9786003-450-140407862622173/AnsiballZ_systemd_service.py'
Oct 07 21:33:02 compute-0 sudo[108398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:33:02 compute-0 python3.9[108400]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 07 21:33:02 compute-0 systemd[1]: Reloading.
Oct 07 21:33:02 compute-0 systemd-rc-local-generator[108424]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:33:02 compute-0 systemd-sysv-generator[108427]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:33:03 compute-0 sudo[108398]: pam_unix(sudo:session): session closed for user root
Oct 07 21:33:03 compute-0 sudo[108585]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mztrypsnoltfpjuzluajygghfjuljznq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872783.380994-466-226128101372732/AnsiballZ_command.py'
Oct 07 21:33:03 compute-0 sudo[108585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:33:03 compute-0 python3.9[108587]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:33:03 compute-0 sudo[108585]: pam_unix(sudo:session): session closed for user root
Oct 07 21:33:04 compute-0 sudo[108738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sanwuggfykmbjotzjswtmojqaxsnaqwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872784.0320199-466-28306729359001/AnsiballZ_command.py'
Oct 07 21:33:04 compute-0 sudo[108738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:33:04 compute-0 python3.9[108740]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:33:04 compute-0 sudo[108738]: pam_unix(sudo:session): session closed for user root
Oct 07 21:33:05 compute-0 sudo[108891]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arsjvinernjzcaicparpuylhkfachlaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872784.787898-466-266747341205654/AnsiballZ_command.py'
Oct 07 21:33:05 compute-0 sudo[108891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:33:05 compute-0 python3.9[108893]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:33:05 compute-0 sudo[108891]: pam_unix(sudo:session): session closed for user root
Oct 07 21:33:05 compute-0 sudo[109044]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbjwqwdfpicckbnyrowcvwklntpzxnor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872785.445187-466-163801907335220/AnsiballZ_command.py'
Oct 07 21:33:05 compute-0 sudo[109044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:33:05 compute-0 python3.9[109046]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:33:05 compute-0 sudo[109044]: pam_unix(sudo:session): session closed for user root
Oct 07 21:33:06 compute-0 sudo[109197]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svitbghpymemmyiuwnmkzsvbagasbfgk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872786.1364563-466-172223111910367/AnsiballZ_command.py'
Oct 07 21:33:06 compute-0 sudo[109197]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:33:06 compute-0 python3.9[109199]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:33:06 compute-0 sudo[109197]: pam_unix(sudo:session): session closed for user root
Oct 07 21:33:07 compute-0 sudo[109350]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tomelavzlrjdmztgopibecxcdsqdoyod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872786.8964868-466-213152860389163/AnsiballZ_command.py'
Oct 07 21:33:07 compute-0 sudo[109350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:33:07 compute-0 python3.9[109352]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:33:07 compute-0 sudo[109350]: pam_unix(sudo:session): session closed for user root
Oct 07 21:33:08 compute-0 sudo[109503]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tonbspjodjgmunhunpoiddzgipagzlgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872787.7982993-466-23983252308864/AnsiballZ_command.py'
Oct 07 21:33:08 compute-0 sudo[109503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:33:08 compute-0 python3.9[109505]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:33:08 compute-0 sudo[109503]: pam_unix(sudo:session): session closed for user root
Oct 07 21:33:10 compute-0 sudo[109656]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehbuzuncaichlxxriomjawhoxtvimkcs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872789.9002366-574-170081146710534/AnsiballZ_getent.py'
Oct 07 21:33:10 compute-0 sudo[109656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:33:10 compute-0 python3.9[109658]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Oct 07 21:33:10 compute-0 sudo[109656]: pam_unix(sudo:session): session closed for user root
Oct 07 21:33:11 compute-0 sshd-session[106984]: Invalid user admin from 116.110.151.5 port 32872
Oct 07 21:33:11 compute-0 sshd-session[106984]: Connection closed by invalid user admin 116.110.151.5 port 32872 [preauth]
Oct 07 21:33:11 compute-0 sudo[109809]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkmxmauqsxaolokboolqhimbolikhmex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872791.049065-590-25980020230073/AnsiballZ_group.py'
Oct 07 21:33:11 compute-0 sudo[109809]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:33:11 compute-0 python3.9[109811]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct 07 21:33:11 compute-0 groupadd[109812]: group added to /etc/group: name=libvirt, GID=42473
Oct 07 21:33:11 compute-0 groupadd[109812]: group added to /etc/gshadow: name=libvirt
Oct 07 21:33:11 compute-0 groupadd[109812]: new group: name=libvirt, GID=42473
Oct 07 21:33:12 compute-0 sudo[109809]: pam_unix(sudo:session): session closed for user root
Oct 07 21:33:13 compute-0 sudo[109967]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhzricjcpyhomlajjvkralqewqvhgfti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872792.448582-606-225883432264223/AnsiballZ_user.py'
Oct 07 21:33:13 compute-0 sudo[109967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:33:13 compute-0 python3.9[109969]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct 07 21:33:13 compute-0 useradd[109971]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/0
Oct 07 21:33:13 compute-0 sudo[109967]: pam_unix(sudo:session): session closed for user root
Oct 07 21:33:14 compute-0 sudo[110127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kezdutuluklnyajiuqyikotvzuxastna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872793.9320014-628-103474600537010/AnsiballZ_setup.py'
Oct 07 21:33:14 compute-0 sudo[110127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:33:14 compute-0 python3.9[110129]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 07 21:33:14 compute-0 sudo[110127]: pam_unix(sudo:session): session closed for user root
Oct 07 21:33:15 compute-0 sudo[110211]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hzkbrtwzkpqvwmgouodnzgfzasbktgod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872793.9320014-628-103474600537010/AnsiballZ_dnf.py'
Oct 07 21:33:15 compute-0 sudo[110211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:33:15 compute-0 python3.9[110213]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 07 21:33:19 compute-0 podman[110224]: 2025-10-07 21:33:19.933957688 +0000 UTC m=+0.165911009 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller)
Oct 07 21:33:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:33:25.564 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:33:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:33:25.564 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:33:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:33:25.564 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:33:25 compute-0 podman[110357]: 2025-10-07 21:33:25.805019973 +0000 UTC m=+0.047802727 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, org.label-schema.build-date=20251007)
Oct 07 21:33:27 compute-0 sshd-session[110250]: Invalid user user from 116.110.151.5 port 37154
Oct 07 21:33:27 compute-0 sshd-session[110250]: Connection closed by invalid user user 116.110.151.5 port 37154 [preauth]
Oct 07 21:33:43 compute-0 kernel: SELinux:  Converting 2753 SID table entries...
Oct 07 21:33:43 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Oct 07 21:33:43 compute-0 kernel: SELinux:  policy capability open_perms=1
Oct 07 21:33:43 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Oct 07 21:33:43 compute-0 kernel: SELinux:  policy capability always_check_network=0
Oct 07 21:33:43 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 07 21:33:43 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 07 21:33:43 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 07 21:33:50 compute-0 dbus-broker-launch[777]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Oct 07 21:33:50 compute-0 podman[110459]: 2025-10-07 21:33:50.883560133 +0000 UTC m=+0.107731458 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251007, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 07 21:33:52 compute-0 kernel: SELinux:  Converting 2753 SID table entries...
Oct 07 21:33:52 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Oct 07 21:33:52 compute-0 kernel: SELinux:  policy capability open_perms=1
Oct 07 21:33:52 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Oct 07 21:33:52 compute-0 kernel: SELinux:  policy capability always_check_network=0
Oct 07 21:33:52 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 07 21:33:52 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 07 21:33:52 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 07 21:33:56 compute-0 dbus-broker-launch[777]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Oct 07 21:33:56 compute-0 podman[110490]: 2025-10-07 21:33:56.827417678 +0000 UTC m=+0.060617560 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, config_id=ovn_metadata_agent, tcib_build_tag=watcher_latest)
Oct 07 21:34:07 compute-0 sshd-session[110645]: Invalid user admin from 116.110.151.5 port 37272
Oct 07 21:34:07 compute-0 sshd-session[110645]: Connection closed by invalid user admin 116.110.151.5 port 37272 [preauth]
Oct 07 21:34:21 compute-0 podman[118839]: 2025-10-07 21:34:21.86478621 +0000 UTC m=+0.107724127 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251007, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Oct 07 21:34:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:34:25.565 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:34:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:34:25.565 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:34:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:34:25.566 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:34:27 compute-0 podman[122334]: 2025-10-07 21:34:27.824858662 +0000 UTC m=+0.056098372 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251007, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Oct 07 21:34:49 compute-0 kernel: SELinux:  Converting 2754 SID table entries...
Oct 07 21:34:49 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Oct 07 21:34:49 compute-0 kernel: SELinux:  policy capability open_perms=1
Oct 07 21:34:49 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Oct 07 21:34:49 compute-0 kernel: SELinux:  policy capability always_check_network=0
Oct 07 21:34:49 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 07 21:34:49 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 07 21:34:49 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 07 21:34:50 compute-0 groupadd[127318]: group added to /etc/group: name=dnsmasq, GID=992
Oct 07 21:34:50 compute-0 groupadd[127318]: group added to /etc/gshadow: name=dnsmasq
Oct 07 21:34:50 compute-0 groupadd[127318]: new group: name=dnsmasq, GID=992
Oct 07 21:34:50 compute-0 useradd[127325]: new user: name=dnsmasq, UID=992, GID=992, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Oct 07 21:34:50 compute-0 dbus-broker-launch[763]: Noticed file-system modification, trigger reload.
Oct 07 21:34:50 compute-0 dbus-broker-launch[777]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Oct 07 21:34:50 compute-0 dbus-broker-launch[763]: Noticed file-system modification, trigger reload.
Oct 07 21:34:51 compute-0 groupadd[127338]: group added to /etc/group: name=clevis, GID=991
Oct 07 21:34:51 compute-0 groupadd[127338]: group added to /etc/gshadow: name=clevis
Oct 07 21:34:51 compute-0 groupadd[127338]: new group: name=clevis, GID=991
Oct 07 21:34:51 compute-0 useradd[127345]: new user: name=clevis, UID=991, GID=991, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Oct 07 21:34:51 compute-0 usermod[127355]: add 'clevis' to group 'tss'
Oct 07 21:34:51 compute-0 usermod[127355]: add 'clevis' to shadow group 'tss'
Oct 07 21:34:52 compute-0 podman[127365]: 2025-10-07 21:34:52.31384344 +0000 UTC m=+0.211522943 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Oct 07 21:34:53 compute-0 polkitd[6157]: Reloading rules
Oct 07 21:34:53 compute-0 polkitd[6157]: Collecting garbage unconditionally...
Oct 07 21:34:53 compute-0 polkitd[6157]: Loading rules from directory /etc/polkit-1/rules.d
Oct 07 21:34:53 compute-0 polkitd[6157]: Loading rules from directory /usr/share/polkit-1/rules.d
Oct 07 21:34:53 compute-0 polkitd[6157]: Finished loading, compiling and executing 4 rules
Oct 07 21:34:53 compute-0 polkitd[6157]: Reloading rules
Oct 07 21:34:53 compute-0 polkitd[6157]: Collecting garbage unconditionally...
Oct 07 21:34:53 compute-0 polkitd[6157]: Loading rules from directory /etc/polkit-1/rules.d
Oct 07 21:34:53 compute-0 polkitd[6157]: Loading rules from directory /usr/share/polkit-1/rules.d
Oct 07 21:34:53 compute-0 polkitd[6157]: Finished loading, compiling and executing 4 rules
Oct 07 21:34:55 compute-0 groupadd[127569]: group added to /etc/group: name=ceph, GID=167
Oct 07 21:34:55 compute-0 groupadd[127569]: group added to /etc/gshadow: name=ceph
Oct 07 21:34:55 compute-0 groupadd[127569]: new group: name=ceph, GID=167
Oct 07 21:34:55 compute-0 useradd[127575]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Oct 07 21:34:58 compute-0 systemd[1]: Stopping OpenSSH server daemon...
Oct 07 21:34:58 compute-0 sshd[1006]: Received signal 15; terminating.
Oct 07 21:34:58 compute-0 systemd[1]: sshd.service: Deactivated successfully.
Oct 07 21:34:58 compute-0 systemd[1]: Stopped OpenSSH server daemon.
Oct 07 21:34:58 compute-0 systemd[1]: sshd.service: Consumed 2.147s CPU time, read 0B from disk, written 32.0K to disk.
Oct 07 21:34:58 compute-0 systemd[1]: Stopped target sshd-keygen.target.
Oct 07 21:34:58 compute-0 systemd[1]: Stopping sshd-keygen.target...
Oct 07 21:34:58 compute-0 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 07 21:34:58 compute-0 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 07 21:34:58 compute-0 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 07 21:34:58 compute-0 systemd[1]: Reached target sshd-keygen.target.
Oct 07 21:34:58 compute-0 podman[128090]: 2025-10-07 21:34:58.828319847 +0000 UTC m=+0.068394521 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Oct 07 21:34:58 compute-0 systemd[1]: Starting OpenSSH server daemon...
Oct 07 21:34:58 compute-0 sshd[128111]: Server listening on 0.0.0.0 port 22.
Oct 07 21:34:58 compute-0 sshd[128111]: Server listening on :: port 22.
Oct 07 21:34:58 compute-0 systemd[1]: Started OpenSSH server daemon.
Oct 07 21:35:01 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 07 21:35:01 compute-0 systemd[1]: Starting man-db-cache-update.service...
Oct 07 21:35:01 compute-0 systemd[1]: Reloading.
Oct 07 21:35:01 compute-0 systemd-rc-local-generator[128371]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:35:01 compute-0 systemd-sysv-generator[128374]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:35:01 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Oct 07 21:35:03 compute-0 systemd[1]: Starting PackageKit Daemon...
Oct 07 21:35:03 compute-0 PackageKit[130197]: daemon start
Oct 07 21:35:03 compute-0 systemd[1]: Started PackageKit Daemon.
Oct 07 21:35:03 compute-0 sudo[110211]: pam_unix(sudo:session): session closed for user root
Oct 07 21:35:10 compute-0 sudo[135377]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsozweugnnfmkqhmbwhtombsuefeudve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872909.5298586-652-166054002987010/AnsiballZ_systemd.py'
Oct 07 21:35:10 compute-0 sudo[135377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:35:10 compute-0 python3.9[135404]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 07 21:35:10 compute-0 systemd[1]: Reloading.
Oct 07 21:35:10 compute-0 systemd-rc-local-generator[135782]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:35:10 compute-0 systemd-sysv-generator[135787]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:35:10 compute-0 sudo[135377]: pam_unix(sudo:session): session closed for user root
Oct 07 21:35:11 compute-0 sudo[136403]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kiwkysjjlvbcyagfewjfiiwqlfouaohk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872911.133469-652-280782659997785/AnsiballZ_systemd.py'
Oct 07 21:35:11 compute-0 sudo[136403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:35:11 compute-0 python3.9[136424]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 07 21:35:11 compute-0 systemd[1]: Reloading.
Oct 07 21:35:12 compute-0 systemd-sysv-generator[136799]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:35:12 compute-0 systemd-rc-local-generator[136796]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:35:12 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 07 21:35:12 compute-0 systemd[1]: Finished man-db-cache-update.service.
Oct 07 21:35:12 compute-0 systemd[1]: man-db-cache-update.service: Consumed 14.252s CPU time.
Oct 07 21:35:12 compute-0 systemd[1]: run-re184eb5760d44d6591ac15ca6abb7ab2.service: Deactivated successfully.
Oct 07 21:35:13 compute-0 sudo[136403]: pam_unix(sudo:session): session closed for user root
Oct 07 21:35:13 compute-0 sudo[137144]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txceannhhsczhdfhbfckhlvejevrsvez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872913.4117548-652-57819716470397/AnsiballZ_systemd.py'
Oct 07 21:35:13 compute-0 sudo[137144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:35:14 compute-0 python3.9[137146]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 07 21:35:14 compute-0 systemd[1]: Reloading.
Oct 07 21:35:14 compute-0 systemd-sysv-generator[137179]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:35:14 compute-0 systemd-rc-local-generator[137174]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:35:14 compute-0 sudo[137144]: pam_unix(sudo:session): session closed for user root
Oct 07 21:35:15 compute-0 sudo[137335]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-soihktcjhpsckcbhyvohpzxvtbgffewz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872914.5991693-652-196311970787467/AnsiballZ_systemd.py'
Oct 07 21:35:15 compute-0 sudo[137335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:35:15 compute-0 python3.9[137337]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 07 21:35:15 compute-0 systemd[1]: Reloading.
Oct 07 21:35:15 compute-0 systemd-rc-local-generator[137366]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:35:15 compute-0 systemd-sysv-generator[137370]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:35:15 compute-0 sudo[137335]: pam_unix(sudo:session): session closed for user root
Oct 07 21:35:16 compute-0 sudo[137525]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmultkcuziqfndfbgbhmfnzhvunigikz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872916.1105666-710-204891662765325/AnsiballZ_systemd.py'
Oct 07 21:35:16 compute-0 sudo[137525]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:35:16 compute-0 python3.9[137527]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 07 21:35:16 compute-0 systemd[1]: Reloading.
Oct 07 21:35:17 compute-0 systemd-rc-local-generator[137558]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:35:17 compute-0 systemd-sysv-generator[137562]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:35:17 compute-0 sudo[137525]: pam_unix(sudo:session): session closed for user root
Oct 07 21:35:17 compute-0 sudo[137716]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkgcpsnblzlnhpskfpjpispgvxaqkwsy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872917.3600585-710-203944912483353/AnsiballZ_systemd.py'
Oct 07 21:35:17 compute-0 sudo[137716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:35:18 compute-0 python3.9[137719]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 07 21:35:18 compute-0 systemd[1]: Reloading.
Oct 07 21:35:18 compute-0 systemd-sysv-generator[137755]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:35:18 compute-0 systemd-rc-local-generator[137751]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:35:18 compute-0 unix_chkpwd[137758]: password check failed for user (root)
Oct 07 21:35:18 compute-0 sshd-session[137714]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.244  user=root
Oct 07 21:35:18 compute-0 sudo[137716]: pam_unix(sudo:session): session closed for user root
Oct 07 21:35:19 compute-0 sudo[137908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdlsijdubfnoienkdzhrjshusfyftpnl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872918.6602786-710-124766105608852/AnsiballZ_systemd.py'
Oct 07 21:35:19 compute-0 sudo[137908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:35:19 compute-0 python3.9[137910]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 07 21:35:19 compute-0 systemd[1]: Reloading.
Oct 07 21:35:19 compute-0 systemd-sysv-generator[137940]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:35:19 compute-0 systemd-rc-local-generator[137934]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:35:19 compute-0 sudo[137908]: pam_unix(sudo:session): session closed for user root
Oct 07 21:35:20 compute-0 sshd-session[137714]: Failed password for root from 193.46.255.244 port 19844 ssh2
Oct 07 21:35:20 compute-0 sudo[138099]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oezrpvrcermdaiwhgfaxoeavpbyixupe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872919.9105992-710-277257488161197/AnsiballZ_systemd.py'
Oct 07 21:35:20 compute-0 sudo[138099]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:35:20 compute-0 unix_chkpwd[138102]: password check failed for user (root)
Oct 07 21:35:20 compute-0 python3.9[138101]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 07 21:35:20 compute-0 sudo[138099]: pam_unix(sudo:session): session closed for user root
Oct 07 21:35:21 compute-0 sudo[138256]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqfxgaesutdznsorigulgozfdehvzntv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872920.902952-710-246108032823572/AnsiballZ_systemd.py'
Oct 07 21:35:21 compute-0 sudo[138256]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:35:21 compute-0 python3.9[138258]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 07 21:35:21 compute-0 systemd[1]: Reloading.
Oct 07 21:35:21 compute-0 systemd-sysv-generator[138292]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:35:21 compute-0 systemd-rc-local-generator[138289]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:35:21 compute-0 sudo[138256]: pam_unix(sudo:session): session closed for user root
Oct 07 21:35:22 compute-0 sshd-session[137714]: Failed password for root from 193.46.255.244 port 19844 ssh2
Oct 07 21:35:22 compute-0 unix_chkpwd[138398]: password check failed for user (root)
Oct 07 21:35:22 compute-0 sudo[138472]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tegidjbzjucnezvmfyysagcgzekocklp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872922.4693604-782-256356619060492/AnsiballZ_systemd.py'
Oct 07 21:35:22 compute-0 sudo[138472]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:35:22 compute-0 podman[138399]: 2025-10-07 21:35:22.889665308 +0000 UTC m=+0.118610862 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 07 21:35:23 compute-0 python3.9[138477]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 07 21:35:23 compute-0 systemd[1]: Reloading.
Oct 07 21:35:23 compute-0 systemd-rc-local-generator[138502]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:35:23 compute-0 systemd-sysv-generator[138506]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:35:23 compute-0 sshd-session[138297]: Invalid user manager from 103.115.24.11 port 40440
Oct 07 21:35:23 compute-0 sshd-session[138297]: pam_unix(sshd:auth): check pass; user unknown
Oct 07 21:35:23 compute-0 sshd-session[138297]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.115.24.11
Oct 07 21:35:23 compute-0 systemd[1]: Listening on libvirt proxy daemon socket.
Oct 07 21:35:23 compute-0 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Oct 07 21:35:23 compute-0 sudo[138472]: pam_unix(sudo:session): session closed for user root
Oct 07 21:35:24 compute-0 sudo[138668]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmbykulyejtotjycqccozuqvwkbgvnuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872923.9562578-798-146116899342328/AnsiballZ_systemd.py'
Oct 07 21:35:24 compute-0 sudo[138668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:35:24 compute-0 python3.9[138670]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 07 21:35:25 compute-0 sshd-session[138297]: Failed password for invalid user manager from 103.115.24.11 port 40440 ssh2
Oct 07 21:35:25 compute-0 sshd-session[137714]: Failed password for root from 193.46.255.244 port 19844 ssh2
Oct 07 21:35:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:35:25.567 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:35:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:35:25.568 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:35:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:35:25.568 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:35:25 compute-0 sshd-session[138297]: Received disconnect from 103.115.24.11 port 40440:11: Bye Bye [preauth]
Oct 07 21:35:25 compute-0 sshd-session[138297]: Disconnected from invalid user manager 103.115.24.11 port 40440 [preauth]
Oct 07 21:35:25 compute-0 sudo[138668]: pam_unix(sudo:session): session closed for user root
Oct 07 21:35:26 compute-0 sudo[138824]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsjxfghcdfymdlshjvjuoollmrhryzma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872925.7973857-798-6861185153796/AnsiballZ_systemd.py'
Oct 07 21:35:26 compute-0 sudo[138824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:35:26 compute-0 python3.9[138826]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 07 21:35:26 compute-0 sudo[138824]: pam_unix(sudo:session): session closed for user root
Oct 07 21:35:26 compute-0 sudo[138979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbuijbdicmhtsbgxgqfqvjxgfvntvaro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872926.5896742-798-250733503211430/AnsiballZ_systemd.py'
Oct 07 21:35:26 compute-0 sudo[138979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:35:26 compute-0 sshd-session[137714]: Received disconnect from 193.46.255.244 port 19844:11:  [preauth]
Oct 07 21:35:26 compute-0 sshd-session[137714]: Disconnected from authenticating user root 193.46.255.244 port 19844 [preauth]
Oct 07 21:35:26 compute-0 sshd-session[137714]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.244  user=root
Oct 07 21:35:27 compute-0 python3.9[138981]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 07 21:35:27 compute-0 unix_chkpwd[138986]: password check failed for user (ftp)
Oct 07 21:35:27 compute-0 sshd-session[138098]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=116.110.151.5  user=ftp
Oct 07 21:35:27 compute-0 unix_chkpwd[138987]: password check failed for user (root)
Oct 07 21:35:27 compute-0 sshd-session[138982]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.244  user=root
Oct 07 21:35:28 compute-0 sudo[138979]: pam_unix(sudo:session): session closed for user root
Oct 07 21:35:28 compute-0 sudo[139138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvcqsxgshrfcynklcjryskjsysokyfpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872928.4093082-798-105265673055572/AnsiballZ_systemd.py'
Oct 07 21:35:28 compute-0 sudo[139138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:35:29 compute-0 python3.9[139140]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 07 21:35:29 compute-0 sudo[139138]: pam_unix(sudo:session): session closed for user root
Oct 07 21:35:29 compute-0 podman[139142]: 2025-10-07 21:35:29.22408125 +0000 UTC m=+0.097217750 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS)
Oct 07 21:35:29 compute-0 sudo[139313]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jatbntyqcmxskzjhcjghmpppblbssqzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872929.3673818-798-9238533169204/AnsiballZ_systemd.py'
Oct 07 21:35:29 compute-0 sudo[139313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:35:30 compute-0 python3.9[139315]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 07 21:35:30 compute-0 sshd-session[138098]: Failed password for ftp from 116.110.151.5 port 53284 ssh2
Oct 07 21:35:30 compute-0 sudo[139313]: pam_unix(sudo:session): session closed for user root
Oct 07 21:35:30 compute-0 sshd-session[138982]: Failed password for root from 193.46.255.244 port 63592 ssh2
Oct 07 21:35:30 compute-0 sudo[139468]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vejuiuuqnigirmlybhtwambturyiwgng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872930.3159587-798-114633398178115/AnsiballZ_systemd.py'
Oct 07 21:35:30 compute-0 sudo[139468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:35:31 compute-0 python3.9[139470]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 07 21:35:31 compute-0 sudo[139468]: pam_unix(sudo:session): session closed for user root
Oct 07 21:35:31 compute-0 sudo[139623]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iyvxjmfddlpqvwvgntrsxmzmubunehed ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872931.255993-798-222664922572168/AnsiballZ_systemd.py'
Oct 07 21:35:31 compute-0 sudo[139623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:35:31 compute-0 python3.9[139625]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 07 21:35:31 compute-0 sudo[139623]: pam_unix(sudo:session): session closed for user root
Oct 07 21:35:32 compute-0 unix_chkpwd[139654]: password check failed for user (root)
Oct 07 21:35:32 compute-0 sudo[139779]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-totoxmqzigbqhhrcphijkrlhpsozhqfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872932.0400677-798-41176343308100/AnsiballZ_systemd.py'
Oct 07 21:35:32 compute-0 sudo[139779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:35:32 compute-0 sshd-session[138098]: Connection closed by authenticating user ftp 116.110.151.5 port 53284 [preauth]
Oct 07 21:35:32 compute-0 python3.9[139781]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 07 21:35:32 compute-0 sudo[139779]: pam_unix(sudo:session): session closed for user root
Oct 07 21:35:33 compute-0 sudo[139934]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvzvzeukeykupnveeynhlvkgklcqrefg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872932.9467874-798-234682119878201/AnsiballZ_systemd.py'
Oct 07 21:35:33 compute-0 sudo[139934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:35:33 compute-0 python3.9[139936]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 07 21:35:33 compute-0 sshd-session[138982]: Failed password for root from 193.46.255.244 port 63592 ssh2
Oct 07 21:35:34 compute-0 unix_chkpwd[139940]: password check failed for user (root)
Oct 07 21:35:34 compute-0 sudo[139934]: pam_unix(sudo:session): session closed for user root
Oct 07 21:35:35 compute-0 sudo[140090]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttrtjoktmsljkusubvsfmvbldqmbtdzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872934.7760122-798-57849349648914/AnsiballZ_systemd.py'
Oct 07 21:35:35 compute-0 sudo[140090]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:35:35 compute-0 python3.9[140092]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 07 21:35:35 compute-0 sudo[140090]: pam_unix(sudo:session): session closed for user root
Oct 07 21:35:35 compute-0 sudo[140245]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prznfrdsfszpjqntnmjeabxrdgucpgdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872935.6677523-798-39199693597851/AnsiballZ_systemd.py'
Oct 07 21:35:35 compute-0 sudo[140245]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:35:36 compute-0 python3.9[140247]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 07 21:35:36 compute-0 sudo[140245]: pam_unix(sudo:session): session closed for user root
Oct 07 21:35:36 compute-0 sshd-session[138982]: Failed password for root from 193.46.255.244 port 63592 ssh2
Oct 07 21:35:36 compute-0 sudo[140400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhjbqfjqitpcrubzcyozgkpuyndldgve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872936.5258274-798-243033398790689/AnsiballZ_systemd.py'
Oct 07 21:35:36 compute-0 sudo[140400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:35:37 compute-0 python3.9[140402]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 07 21:35:37 compute-0 sudo[140400]: pam_unix(sudo:session): session closed for user root
Oct 07 21:35:37 compute-0 sudo[140555]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjvcwurfbdqmegnzakldsklxapuyuxvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872937.3417823-798-219761779332950/AnsiballZ_systemd.py'
Oct 07 21:35:37 compute-0 sudo[140555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:35:37 compute-0 python3.9[140557]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 07 21:35:38 compute-0 sudo[140555]: pam_unix(sudo:session): session closed for user root
Oct 07 21:35:38 compute-0 sshd-session[138982]: Received disconnect from 193.46.255.244 port 63592:11:  [preauth]
Oct 07 21:35:38 compute-0 sshd-session[138982]: Disconnected from authenticating user root 193.46.255.244 port 63592 [preauth]
Oct 07 21:35:38 compute-0 sshd-session[138982]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.244  user=root
Oct 07 21:35:38 compute-0 sudo[140710]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwwnmknndemytucjqahohcdqrhjzdkiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872938.1498227-798-88816556341727/AnsiballZ_systemd.py'
Oct 07 21:35:38 compute-0 sudo[140710]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:35:38 compute-0 python3.9[140712]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 07 21:35:38 compute-0 sudo[140710]: pam_unix(sudo:session): session closed for user root
Oct 07 21:35:39 compute-0 unix_chkpwd[140742]: password check failed for user (root)
Oct 07 21:35:39 compute-0 sshd-session[140713]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.244  user=root
Oct 07 21:35:39 compute-0 sudo[140868]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oszzhjofmashebszlawvkbpzppvcvlev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872939.306448-1002-186862477334337/AnsiballZ_file.py'
Oct 07 21:35:39 compute-0 sudo[140868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:35:39 compute-0 python3.9[140870]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:35:39 compute-0 sudo[140868]: pam_unix(sudo:session): session closed for user root
Oct 07 21:35:40 compute-0 sudo[141020]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbndtxdcjbtfjrpblyhqwvlyrigaqaiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872940.0733764-1002-233584104234471/AnsiballZ_file.py'
Oct 07 21:35:40 compute-0 sudo[141020]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:35:40 compute-0 python3.9[141022]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:35:40 compute-0 sudo[141020]: pam_unix(sudo:session): session closed for user root
Oct 07 21:35:41 compute-0 sudo[141172]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xissxhydugxlvjwnakrnaedmymtbfpha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872940.8372989-1002-98414708849021/AnsiballZ_file.py'
Oct 07 21:35:41 compute-0 sudo[141172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:35:41 compute-0 sshd-session[140713]: Failed password for root from 193.46.255.244 port 47906 ssh2
Oct 07 21:35:41 compute-0 python3.9[141174]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:35:41 compute-0 sudo[141172]: pam_unix(sudo:session): session closed for user root
Oct 07 21:35:41 compute-0 sudo[141324]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-huktyjesfdonavzhkuqumpfcyqvieirp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872941.7057567-1002-104366769438595/AnsiballZ_file.py'
Oct 07 21:35:41 compute-0 sudo[141324]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:35:42 compute-0 python3.9[141326]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:35:42 compute-0 sudo[141324]: pam_unix(sudo:session): session closed for user root
Oct 07 21:35:42 compute-0 sudo[141476]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tihsftznwuurwhovcovdpvclhncgsjgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872942.3498414-1002-198400545853032/AnsiballZ_file.py'
Oct 07 21:35:42 compute-0 sudo[141476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:35:42 compute-0 python3.9[141478]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:35:42 compute-0 sudo[141476]: pam_unix(sudo:session): session closed for user root
Oct 07 21:35:43 compute-0 unix_chkpwd[141625]: password check failed for user (root)
Oct 07 21:35:43 compute-0 sudo[141629]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxtkaavflqmrgzzxipnwlgnwsycvjedu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872943.1145089-1002-74947632850377/AnsiballZ_file.py'
Oct 07 21:35:43 compute-0 sudo[141629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:35:43 compute-0 python3.9[141631]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:35:43 compute-0 sudo[141629]: pam_unix(sudo:session): session closed for user root
Oct 07 21:35:44 compute-0 sudo[141781]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wlmmbthfcaqfxvpefaozbznkljjcfujb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872944.0579495-1088-10422179820072/AnsiballZ_stat.py'
Oct 07 21:35:44 compute-0 sudo[141781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:35:44 compute-0 python3.9[141783]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:35:44 compute-0 sudo[141781]: pam_unix(sudo:session): session closed for user root
Oct 07 21:35:45 compute-0 sshd-session[140713]: Failed password for root from 193.46.255.244 port 47906 ssh2
Oct 07 21:35:45 compute-0 sudo[141906]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djelsmakktpnswhlimgbtlnnxysqthsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872944.0579495-1088-10422179820072/AnsiballZ_copy.py'
Oct 07 21:35:45 compute-0 sudo[141906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:35:45 compute-0 python3.9[141908]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759872944.0579495-1088-10422179820072/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:35:45 compute-0 sudo[141906]: pam_unix(sudo:session): session closed for user root
Oct 07 21:35:45 compute-0 unix_chkpwd[141913]: password check failed for user (root)
Oct 07 21:35:46 compute-0 sudo[142059]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fposshecalyqnusxkqydpxusjzieoyqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872945.7135098-1088-155208579406496/AnsiballZ_stat.py'
Oct 07 21:35:46 compute-0 sudo[142059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:35:46 compute-0 python3.9[142061]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:35:46 compute-0 sudo[142059]: pam_unix(sudo:session): session closed for user root
Oct 07 21:35:46 compute-0 sudo[142184]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrvreprcqtkvfpknalawqzzqaebodmat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872945.7135098-1088-155208579406496/AnsiballZ_copy.py'
Oct 07 21:35:46 compute-0 sudo[142184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:35:46 compute-0 python3.9[142186]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759872945.7135098-1088-155208579406496/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:35:46 compute-0 sudo[142184]: pam_unix(sudo:session): session closed for user root
Oct 07 21:35:47 compute-0 sudo[142336]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhlqlftqbaghroyogmvnolnxocngyryr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872947.0907845-1088-32415127000235/AnsiballZ_stat.py'
Oct 07 21:35:47 compute-0 sudo[142336]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:35:47 compute-0 python3.9[142338]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:35:47 compute-0 sudo[142336]: pam_unix(sudo:session): session closed for user root
Oct 07 21:35:47 compute-0 sshd-session[140713]: Failed password for root from 193.46.255.244 port 47906 ssh2
Oct 07 21:35:48 compute-0 sudo[142461]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-togtsagplbejipilqbrccnziblkdhzoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872947.0907845-1088-32415127000235/AnsiballZ_copy.py'
Oct 07 21:35:48 compute-0 sudo[142461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:35:48 compute-0 python3.9[142463]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759872947.0907845-1088-32415127000235/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:35:48 compute-0 sudo[142461]: pam_unix(sudo:session): session closed for user root
Oct 07 21:35:48 compute-0 sudo[142613]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xygfytanfjvoloyxibwplqbwpnjtpqre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872948.5360518-1088-52240144120246/AnsiballZ_stat.py'
Oct 07 21:35:48 compute-0 sudo[142613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:35:49 compute-0 python3.9[142615]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:35:49 compute-0 sudo[142613]: pam_unix(sudo:session): session closed for user root
Oct 07 21:35:49 compute-0 sudo[142738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwhqkuwehrlctdedbdxegiuwhvogabrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872948.5360518-1088-52240144120246/AnsiballZ_copy.py'
Oct 07 21:35:49 compute-0 sudo[142738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:35:49 compute-0 python3.9[142740]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759872948.5360518-1088-52240144120246/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:35:49 compute-0 sudo[142738]: pam_unix(sudo:session): session closed for user root
Oct 07 21:35:49 compute-0 sshd-session[140713]: Received disconnect from 193.46.255.244 port 47906:11:  [preauth]
Oct 07 21:35:49 compute-0 sshd-session[140713]: Disconnected from authenticating user root 193.46.255.244 port 47906 [preauth]
Oct 07 21:35:49 compute-0 sshd-session[140713]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.244  user=root
Oct 07 21:35:50 compute-0 sudo[142890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iweastjholzrbxsjupbyrxmyhedwwnfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872949.8179636-1088-90911080933054/AnsiballZ_stat.py'
Oct 07 21:35:50 compute-0 sudo[142890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:35:50 compute-0 python3.9[142892]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:35:50 compute-0 sudo[142890]: pam_unix(sudo:session): session closed for user root
Oct 07 21:35:50 compute-0 sudo[143015]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqhtfdkvidrwzywraektxfvojptbymey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872949.8179636-1088-90911080933054/AnsiballZ_copy.py'
Oct 07 21:35:50 compute-0 sudo[143015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:35:51 compute-0 python3.9[143017]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759872949.8179636-1088-90911080933054/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:35:51 compute-0 sudo[143015]: pam_unix(sudo:session): session closed for user root
Oct 07 21:35:51 compute-0 sudo[143167]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgoxejrduhgrlpmgmnaetdgbocyyihrq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872951.1965368-1088-218078382436878/AnsiballZ_stat.py'
Oct 07 21:35:51 compute-0 sudo[143167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:35:51 compute-0 python3.9[143169]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:35:51 compute-0 sudo[143167]: pam_unix(sudo:session): session closed for user root
Oct 07 21:35:52 compute-0 sudo[143292]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajyovhwjrtwkovqojbjvbwnipkqxtaho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872951.1965368-1088-218078382436878/AnsiballZ_copy.py'
Oct 07 21:35:52 compute-0 sudo[143292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:35:52 compute-0 python3.9[143294]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759872951.1965368-1088-218078382436878/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:35:52 compute-0 sudo[143292]: pam_unix(sudo:session): session closed for user root
Oct 07 21:35:52 compute-0 sudo[143444]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvfhyjdaukasuiavbxowicvxbbsdbilf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872952.4844232-1088-214130696386428/AnsiballZ_stat.py'
Oct 07 21:35:52 compute-0 sudo[143444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:35:52 compute-0 python3.9[143446]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:35:53 compute-0 sudo[143444]: pam_unix(sudo:session): session closed for user root
Oct 07 21:35:53 compute-0 sudo[143582]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zlxgatcbcqdecshdvwhdfvckiujmvdvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872952.4844232-1088-214130696386428/AnsiballZ_copy.py'
Oct 07 21:35:53 compute-0 sudo[143582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:35:53 compute-0 podman[143541]: 2025-10-07 21:35:53.473289678 +0000 UTC m=+0.086374874 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251007)
Oct 07 21:35:53 compute-0 python3.9[143588]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759872952.4844232-1088-214130696386428/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:35:53 compute-0 sudo[143582]: pam_unix(sudo:session): session closed for user root
Oct 07 21:35:54 compute-0 sudo[143744]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yuszdeczzzjaqlvdljzzjimuhkjridjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872953.8140624-1088-96377450865541/AnsiballZ_stat.py'
Oct 07 21:35:54 compute-0 sudo[143744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:35:54 compute-0 python3.9[143746]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:35:54 compute-0 sudo[143744]: pam_unix(sudo:session): session closed for user root
Oct 07 21:35:54 compute-0 sudo[143869]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmsnyowwjtotvxdxwsaurrtqdtsayxvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872953.8140624-1088-96377450865541/AnsiballZ_copy.py'
Oct 07 21:35:54 compute-0 sudo[143869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:35:55 compute-0 python3.9[143871]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759872953.8140624-1088-96377450865541/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:35:55 compute-0 sudo[143869]: pam_unix(sudo:session): session closed for user root
Oct 07 21:35:56 compute-0 sudo[144023]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kkuasfjbsehelrowubmoygeabfoytshl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872956.6107218-1314-221568797034978/AnsiballZ_command.py'
Oct 07 21:35:56 compute-0 sudo[144023]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:35:57 compute-0 python3.9[144025]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Oct 07 21:35:57 compute-0 sudo[144023]: pam_unix(sudo:session): session closed for user root
Oct 07 21:35:57 compute-0 sudo[144176]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovtyzpvylsjrfprdmxbdoooupacbzbgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872957.4949896-1332-105741315331191/AnsiballZ_file.py'
Oct 07 21:35:57 compute-0 sudo[144176]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:35:58 compute-0 python3.9[144178]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:35:58 compute-0 sudo[144176]: pam_unix(sudo:session): session closed for user root
Oct 07 21:35:58 compute-0 sudo[144328]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swtmgrpjpddrdjuxossszxorbcsahxeb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872958.1856148-1332-31698314761105/AnsiballZ_file.py'
Oct 07 21:35:58 compute-0 sudo[144328]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:35:58 compute-0 python3.9[144330]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:35:58 compute-0 sudo[144328]: pam_unix(sudo:session): session closed for user root
Oct 07 21:35:59 compute-0 sudo[144480]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zycurqyipidzxhjeverhwwcaswzrstid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872958.94554-1332-218404625249912/AnsiballZ_file.py'
Oct 07 21:35:59 compute-0 sudo[144480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:35:59 compute-0 podman[144482]: 2025-10-07 21:35:59.41008505 +0000 UTC m=+0.086371803 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct 07 21:35:59 compute-0 python3.9[144483]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:35:59 compute-0 sudo[144480]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:00 compute-0 sudo[144651]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jszybidbjfcaopodkfxijrwnxlruwsgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872959.7359524-1332-270458155380512/AnsiballZ_file.py'
Oct 07 21:36:00 compute-0 sudo[144651]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:00 compute-0 python3.9[144653]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:36:00 compute-0 sudo[144651]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:00 compute-0 sudo[144803]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwznhqvsvdusartgkjgnzqbkdgiddqyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872960.4940352-1332-149221634219377/AnsiballZ_file.py'
Oct 07 21:36:00 compute-0 sudo[144803]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:00 compute-0 unix_chkpwd[144806]: password check failed for user (operator)
Oct 07 21:36:00 compute-0 sshd-session[143896]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=116.110.151.5  user=operator
Oct 07 21:36:01 compute-0 python3.9[144805]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:36:01 compute-0 sudo[144803]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:01 compute-0 sudo[144956]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdpxwmprarfduefacqpffqrshsnlehrd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872961.2012942-1332-259082455190525/AnsiballZ_file.py'
Oct 07 21:36:01 compute-0 sudo[144956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:01 compute-0 python3.9[144958]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:36:01 compute-0 sudo[144956]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:02 compute-0 sshd-session[143896]: Failed password for operator from 116.110.151.5 port 39110 ssh2
Oct 07 21:36:02 compute-0 sudo[145108]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkikoeolgtdtntgelsruoaaytkksorkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872961.916733-1332-85739143590510/AnsiballZ_file.py'
Oct 07 21:36:02 compute-0 sudo[145108]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:02 compute-0 python3.9[145110]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:36:02 compute-0 sudo[145108]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:03 compute-0 sudo[145260]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdgqagghpdhdbulcgrexrikklmyykojr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872962.6778324-1332-28250341613518/AnsiballZ_file.py'
Oct 07 21:36:03 compute-0 sudo[145260]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:03 compute-0 sshd-session[143896]: Connection closed by authenticating user operator 116.110.151.5 port 39110 [preauth]
Oct 07 21:36:03 compute-0 python3.9[145262]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:36:03 compute-0 sudo[145260]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:03 compute-0 sudo[145412]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfglfwydjkchdpsaibnazhnbhbakrmvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872963.3721068-1332-114775519775898/AnsiballZ_file.py'
Oct 07 21:36:03 compute-0 sudo[145412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:03 compute-0 python3.9[145414]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:36:03 compute-0 sudo[145412]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:04 compute-0 sudo[145564]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqmgrauuxlpcskzuqsgyyvbfuolfemsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872964.0813222-1332-180406777213464/AnsiballZ_file.py'
Oct 07 21:36:04 compute-0 sudo[145564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:04 compute-0 python3.9[145566]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:36:04 compute-0 sudo[145564]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:05 compute-0 sudo[145716]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efrzpdyamusfoouhoqjrmzsvodauccws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872964.860975-1332-71418510174383/AnsiballZ_file.py'
Oct 07 21:36:05 compute-0 sudo[145716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:05 compute-0 python3.9[145718]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:36:05 compute-0 sudo[145716]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:05 compute-0 sudo[145868]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjmjxvmxovugrcjjtdczbeuuyzlnycmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872965.6294954-1332-119092402816517/AnsiballZ_file.py'
Oct 07 21:36:05 compute-0 sudo[145868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:06 compute-0 python3.9[145870]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:36:06 compute-0 sudo[145868]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:06 compute-0 sudo[146020]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oddyzavagrkjnbrdrrmwuxdrxewgmzil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872966.3549323-1332-279534102448737/AnsiballZ_file.py'
Oct 07 21:36:06 compute-0 sudo[146020]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:06 compute-0 python3.9[146022]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:36:07 compute-0 sudo[146020]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:07 compute-0 sudo[146172]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vysutjevptanaamvujoqqbgnqgnhjaqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872967.1627965-1332-56703535043610/AnsiballZ_file.py'
Oct 07 21:36:07 compute-0 sudo[146172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:07 compute-0 python3.9[146174]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:36:07 compute-0 sudo[146172]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:08 compute-0 sudo[146324]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lublkrpfukfxnkwuotebazuxlabbmqtm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872968.284751-1530-81509227876480/AnsiballZ_stat.py'
Oct 07 21:36:08 compute-0 sudo[146324]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:08 compute-0 python3.9[146326]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:36:08 compute-0 sudo[146324]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:09 compute-0 sudo[146447]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lffwfxuytgmvtixrrporkiyyxpkusvnh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872968.284751-1530-81509227876480/AnsiballZ_copy.py'
Oct 07 21:36:09 compute-0 sudo[146447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:09 compute-0 python3.9[146449]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759872968.284751-1530-81509227876480/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:36:09 compute-0 sudo[146447]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:10 compute-0 sudo[146599]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oildtolzvumazsssoqailovnaeumnlkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872969.679742-1530-43437420921214/AnsiballZ_stat.py'
Oct 07 21:36:10 compute-0 sudo[146599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:10 compute-0 python3.9[146601]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:36:10 compute-0 sudo[146599]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:10 compute-0 sudo[146722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezmlyuhkzyfuyygmffwnmrjmdffelfwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872969.679742-1530-43437420921214/AnsiballZ_copy.py'
Oct 07 21:36:10 compute-0 sudo[146722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:11 compute-0 python3.9[146724]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759872969.679742-1530-43437420921214/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:36:11 compute-0 sudo[146722]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:11 compute-0 sudo[146874]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyrlktgimxcyfnosoykjsjomyvygqlyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872971.271009-1530-256073091157385/AnsiballZ_stat.py'
Oct 07 21:36:11 compute-0 sudo[146874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:11 compute-0 python3.9[146876]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:36:11 compute-0 sudo[146874]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:12 compute-0 sudo[146997]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lddjbvulmtlkjykkightthybgdvhsonf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872971.271009-1530-256073091157385/AnsiballZ_copy.py'
Oct 07 21:36:12 compute-0 sudo[146997]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:12 compute-0 python3.9[146999]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759872971.271009-1530-256073091157385/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:36:12 compute-0 sudo[146997]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:13 compute-0 sudo[147149]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhzvdxqpheqztbmgakfmtayhiunlulzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872972.7592685-1530-75291815536088/AnsiballZ_stat.py'
Oct 07 21:36:13 compute-0 sudo[147149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:13 compute-0 python3.9[147151]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:36:13 compute-0 sudo[147149]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:13 compute-0 sudo[147272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlijznulcnobhirbwrvxzyrjbfvqwkhp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872972.7592685-1530-75291815536088/AnsiballZ_copy.py'
Oct 07 21:36:13 compute-0 sudo[147272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:13 compute-0 python3.9[147274]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759872972.7592685-1530-75291815536088/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:36:13 compute-0 sudo[147272]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:14 compute-0 sudo[147424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eyxlosfqszzzvasxitoitshffdcutdvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872973.9879956-1530-176742270866321/AnsiballZ_stat.py'
Oct 07 21:36:14 compute-0 sudo[147424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:14 compute-0 python3.9[147426]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:36:14 compute-0 sudo[147424]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:15 compute-0 sudo[147547]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-knylspsxadydnsfcpddtwhdyvboofhdm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872973.9879956-1530-176742270866321/AnsiballZ_copy.py'
Oct 07 21:36:15 compute-0 sudo[147547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:15 compute-0 python3.9[147549]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759872973.9879956-1530-176742270866321/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:36:15 compute-0 sudo[147547]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:15 compute-0 sudo[147699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-faufkkdgzrwffpqootryyajyknvclshv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872975.4969807-1530-140081960796762/AnsiballZ_stat.py'
Oct 07 21:36:15 compute-0 sudo[147699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:16 compute-0 python3.9[147701]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:36:16 compute-0 sudo[147699]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:16 compute-0 sudo[147822]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqeqaumxhtirsrvozfigtrhunzvpesfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872975.4969807-1530-140081960796762/AnsiballZ_copy.py'
Oct 07 21:36:16 compute-0 sudo[147822]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:16 compute-0 python3.9[147824]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759872975.4969807-1530-140081960796762/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:36:16 compute-0 sudo[147822]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:17 compute-0 sudo[147974]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpvimakaonxvjjphulobhwtqcoafpxof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872976.9586952-1530-192311995167721/AnsiballZ_stat.py'
Oct 07 21:36:17 compute-0 sudo[147974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:17 compute-0 python3.9[147976]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:36:17 compute-0 sudo[147974]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:17 compute-0 sudo[148097]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxdoubcivkfqghjumkzijwyjaazshuca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872976.9586952-1530-192311995167721/AnsiballZ_copy.py'
Oct 07 21:36:17 compute-0 sudo[148097]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:18 compute-0 python3.9[148099]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759872976.9586952-1530-192311995167721/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:36:18 compute-0 sudo[148097]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:18 compute-0 sudo[148249]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvkkilyxniyqsfhbnuzoevjskwmbwuex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872978.2957187-1530-231419546455503/AnsiballZ_stat.py'
Oct 07 21:36:18 compute-0 sudo[148249]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:18 compute-0 python3.9[148251]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:36:18 compute-0 sudo[148249]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:19 compute-0 sudo[148372]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oakojwfthnqexmtzmgojqkvdppsbtfhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872978.2957187-1530-231419546455503/AnsiballZ_copy.py'
Oct 07 21:36:19 compute-0 sudo[148372]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:19 compute-0 python3.9[148374]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759872978.2957187-1530-231419546455503/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:36:19 compute-0 sudo[148372]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:20 compute-0 sudo[148524]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phfvburqvwdgvtfvkehwdczmbpmmgtre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872979.725025-1530-7428623524161/AnsiballZ_stat.py'
Oct 07 21:36:20 compute-0 sudo[148524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:20 compute-0 python3.9[148526]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:36:20 compute-0 sudo[148524]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:20 compute-0 sudo[148647]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xiydrghehiiedxfawkngmaryuxohmhrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872979.725025-1530-7428623524161/AnsiballZ_copy.py'
Oct 07 21:36:20 compute-0 sudo[148647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:20 compute-0 python3.9[148649]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759872979.725025-1530-7428623524161/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:36:20 compute-0 sudo[148647]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:21 compute-0 sudo[148799]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qofnpldpunixglvwxjsvnihkkvuyqeue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872981.086037-1530-42915499271125/AnsiballZ_stat.py'
Oct 07 21:36:21 compute-0 sudo[148799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:21 compute-0 python3.9[148801]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:36:21 compute-0 sudo[148799]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:22 compute-0 sudo[148922]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrrduwhkqcimcupltvnpcvyqjjpbveyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872981.086037-1530-42915499271125/AnsiballZ_copy.py'
Oct 07 21:36:22 compute-0 sudo[148922]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:22 compute-0 python3.9[148924]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759872981.086037-1530-42915499271125/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:36:22 compute-0 sudo[148922]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:22 compute-0 sudo[149074]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxrxmhcbahxaxdsorlbvizfzyhavdixt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872982.4615154-1530-36036865402146/AnsiballZ_stat.py'
Oct 07 21:36:22 compute-0 sudo[149074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:23 compute-0 python3.9[149076]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:36:23 compute-0 sudo[149074]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:23 compute-0 sudo[149197]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fznvmcotwjyimelwxcjfoltflwslgsjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872982.4615154-1530-36036865402146/AnsiballZ_copy.py'
Oct 07 21:36:23 compute-0 sudo[149197]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:23 compute-0 python3.9[149199]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759872982.4615154-1530-36036865402146/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:36:23 compute-0 sudo[149197]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:23 compute-0 podman[149206]: 2025-10-07 21:36:23.896716276 +0000 UTC m=+0.123281254 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4)
Oct 07 21:36:24 compute-0 sudo[149376]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hidvuwnxxoanwxpamhgsoiiunetpbgyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872983.8533752-1530-156246287137329/AnsiballZ_stat.py'
Oct 07 21:36:24 compute-0 sudo[149376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:24 compute-0 python3.9[149378]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:36:24 compute-0 sudo[149376]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:24 compute-0 sudo[149499]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtzipyvwopqmrsgneophonrgnhopluem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872983.8533752-1530-156246287137329/AnsiballZ_copy.py'
Oct 07 21:36:24 compute-0 sudo[149499]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:25 compute-0 python3.9[149501]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759872983.8533752-1530-156246287137329/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:36:25 compute-0 sudo[149499]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:25 compute-0 sudo[149651]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uulvbkrehbzkyviwgkawmjdxsmzcexbf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872985.1845205-1530-239484148914054/AnsiballZ_stat.py'
Oct 07 21:36:25 compute-0 sudo[149651]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:36:25.570 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:36:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:36:25.570 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:36:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:36:25.571 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:36:25 compute-0 python3.9[149653]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:36:25 compute-0 sudo[149651]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:26 compute-0 sudo[149775]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oasnhjejhermlqqhyekbddjdhazvqdlv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872985.1845205-1530-239484148914054/AnsiballZ_copy.py'
Oct 07 21:36:26 compute-0 sudo[149775]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:26 compute-0 python3.9[149777]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759872985.1845205-1530-239484148914054/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:36:26 compute-0 sudo[149775]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:26 compute-0 sudo[149927]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwvzmhjezcwykcqnigdpvxpurpslgzit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872986.5348406-1530-158865280129773/AnsiballZ_stat.py'
Oct 07 21:36:26 compute-0 sudo[149927]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:27 compute-0 python3.9[149929]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:36:27 compute-0 sudo[149927]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:27 compute-0 sudo[150050]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agtkrexxmzytoxesgkqszjwiepahukcd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872986.5348406-1530-158865280129773/AnsiballZ_copy.py'
Oct 07 21:36:27 compute-0 sudo[150050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:27 compute-0 python3.9[150052]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759872986.5348406-1530-158865280129773/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:36:27 compute-0 sudo[150050]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:28 compute-0 python3.9[150202]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:36:29 compute-0 sudo[150355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ochnryupekqlmfmlpmetoldenikkgycc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872988.9812882-1942-101985888904206/AnsiballZ_seboolean.py'
Oct 07 21:36:29 compute-0 sudo[150355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:29 compute-0 podman[150357]: 2025-10-07 21:36:29.592799618 +0000 UTC m=+0.074699788 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct 07 21:36:29 compute-0 python3.9[150358]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Oct 07 21:36:31 compute-0 sudo[150355]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:31 compute-0 sudo[150530]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqfnxnxerhndrceokdlhicmtxfwvjxiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872991.4393196-1958-281214597203818/AnsiballZ_copy.py'
Oct 07 21:36:31 compute-0 dbus-broker-launch[777]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Oct 07 21:36:31 compute-0 sudo[150530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:32 compute-0 python3.9[150532]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:36:32 compute-0 sudo[150530]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:32 compute-0 sudo[150682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sirpqwniebfztynsxwgectyckhdwactz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872992.1820347-1958-427229125616/AnsiballZ_copy.py'
Oct 07 21:36:32 compute-0 sudo[150682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:32 compute-0 python3.9[150684]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:36:32 compute-0 sudo[150682]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:33 compute-0 sudo[150834]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzfogbhspmpkwjlwltmowxppxlqptmbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872992.8968883-1958-185609495751898/AnsiballZ_copy.py'
Oct 07 21:36:33 compute-0 sudo[150834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:33 compute-0 python3.9[150836]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:36:33 compute-0 sudo[150834]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:34 compute-0 sudo[150986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbchqrbnrbiggprzdgbwmqhkulknbrli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872993.7134173-1958-21775152497039/AnsiballZ_copy.py'
Oct 07 21:36:34 compute-0 sudo[150986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:34 compute-0 python3.9[150988]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:36:34 compute-0 sudo[150986]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:34 compute-0 sudo[151138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtojdakaovnhzwsdmredajkvogngynxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872994.5075793-1958-158632919826797/AnsiballZ_copy.py'
Oct 07 21:36:34 compute-0 sudo[151138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:35 compute-0 python3.9[151140]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:36:35 compute-0 sudo[151138]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:35 compute-0 sudo[151290]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpwexatnmsmmpeswfhilmneqmfmvkcxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872995.3612752-2030-68408598308425/AnsiballZ_copy.py'
Oct 07 21:36:35 compute-0 sudo[151290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:35 compute-0 python3.9[151292]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:36:35 compute-0 sudo[151290]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:36 compute-0 sudo[151442]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uettbnieqfdhxeexiwoytmrbxeciuzjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872996.1064694-2030-181165218792555/AnsiballZ_copy.py'
Oct 07 21:36:36 compute-0 sudo[151442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:36 compute-0 python3.9[151444]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:36:36 compute-0 sudo[151442]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:37 compute-0 sudo[151594]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-quyvraukdunuiaukrpsmbvtkicxmdktp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872996.8431165-2030-236291896808932/AnsiballZ_copy.py'
Oct 07 21:36:37 compute-0 sudo[151594]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:37 compute-0 python3.9[151596]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:36:37 compute-0 sudo[151594]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:37 compute-0 sudo[151746]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbadnakfsikuspbnuanfyabikwidmgyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872997.5881863-2030-230712773503060/AnsiballZ_copy.py'
Oct 07 21:36:37 compute-0 sudo[151746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:38 compute-0 python3.9[151748]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:36:38 compute-0 sudo[151746]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:38 compute-0 sudo[151898]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrnjbqhlpotyijkynyjwgzkcoldswkeu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872998.2872176-2030-114258243326215/AnsiballZ_copy.py'
Oct 07 21:36:38 compute-0 sudo[151898]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:38 compute-0 python3.9[151900]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:36:38 compute-0 sudo[151898]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:39 compute-0 sudo[152050]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpdotspcudxuxbvhjwhmmfyicpmkvygb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759872999.209889-2102-93136500535040/AnsiballZ_systemd.py'
Oct 07 21:36:39 compute-0 sudo[152050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:39 compute-0 python3.9[152052]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 07 21:36:39 compute-0 systemd[1]: Reloading.
Oct 07 21:36:40 compute-0 systemd-rc-local-generator[152079]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:36:40 compute-0 systemd-sysv-generator[152085]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:36:40 compute-0 systemd[1]: Starting libvirt logging daemon socket...
Oct 07 21:36:40 compute-0 systemd[1]: Listening on libvirt logging daemon socket.
Oct 07 21:36:40 compute-0 systemd[1]: Starting libvirt logging daemon admin socket...
Oct 07 21:36:40 compute-0 systemd[1]: Listening on libvirt logging daemon admin socket.
Oct 07 21:36:40 compute-0 systemd[1]: Starting libvirt logging daemon...
Oct 07 21:36:40 compute-0 systemd[1]: Started libvirt logging daemon.
Oct 07 21:36:40 compute-0 sudo[152050]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:41 compute-0 sudo[152243]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukmqzrflpqxhoktrexpyowiqlvrrqkyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873000.6045508-2102-212546843034982/AnsiballZ_systemd.py'
Oct 07 21:36:41 compute-0 sudo[152243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:41 compute-0 python3.9[152245]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 07 21:36:41 compute-0 systemd[1]: Reloading.
Oct 07 21:36:41 compute-0 systemd-sysv-generator[152278]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:36:41 compute-0 systemd-rc-local-generator[152274]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:36:41 compute-0 systemd[1]: Starting libvirt nodedev daemon socket...
Oct 07 21:36:41 compute-0 systemd[1]: Listening on libvirt nodedev daemon socket.
Oct 07 21:36:41 compute-0 systemd[1]: Starting libvirt nodedev daemon admin socket...
Oct 07 21:36:41 compute-0 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Oct 07 21:36:41 compute-0 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Oct 07 21:36:41 compute-0 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Oct 07 21:36:41 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Oct 07 21:36:41 compute-0 systemd[1]: Started libvirt nodedev daemon.
Oct 07 21:36:41 compute-0 sudo[152243]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:42 compute-0 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Oct 07 21:36:42 compute-0 sudo[152459]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrbfcvigwwpxyxhyccovpmfwojwsatsb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873001.9581587-2102-66054714128740/AnsiballZ_systemd.py'
Oct 07 21:36:42 compute-0 sudo[152459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:42 compute-0 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Oct 07 21:36:42 compute-0 python3.9[152461]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 07 21:36:42 compute-0 systemd[1]: Reloading.
Oct 07 21:36:42 compute-0 systemd-rc-local-generator[152489]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:36:42 compute-0 systemd-sysv-generator[152495]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:36:42 compute-0 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Oct 07 21:36:42 compute-0 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Oct 07 21:36:42 compute-0 systemd[1]: Starting libvirt proxy daemon admin socket...
Oct 07 21:36:42 compute-0 systemd[1]: Starting libvirt proxy daemon read-only socket...
Oct 07 21:36:42 compute-0 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Oct 07 21:36:42 compute-0 systemd[1]: Listening on libvirt proxy daemon admin socket.
Oct 07 21:36:42 compute-0 systemd[1]: Starting libvirt proxy daemon...
Oct 07 21:36:42 compute-0 systemd[1]: Started libvirt proxy daemon.
Oct 07 21:36:43 compute-0 sudo[152459]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:43 compute-0 sudo[152676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdselnpgpxbxqhggtvwxhlhmpqdrqtnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873003.2902832-2102-3669784994403/AnsiballZ_systemd.py'
Oct 07 21:36:43 compute-0 sudo[152676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:43 compute-0 setroubleshoot[152385]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 548344e1-7adc-43c4-bf07-0df353752cd6
Oct 07 21:36:43 compute-0 setroubleshoot[152385]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Oct 07 21:36:43 compute-0 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 07 21:36:44 compute-0 python3.9[152678]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 07 21:36:44 compute-0 systemd[1]: Reloading.
Oct 07 21:36:44 compute-0 systemd-rc-local-generator[152707]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:36:44 compute-0 systemd-sysv-generator[152713]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:36:44 compute-0 systemd[1]: Listening on libvirt locking daemon socket.
Oct 07 21:36:44 compute-0 systemd[1]: Starting libvirt QEMU daemon socket...
Oct 07 21:36:44 compute-0 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Oct 07 21:36:44 compute-0 systemd[1]: Starting Virtual Machine and Container Registration Service...
Oct 07 21:36:44 compute-0 systemd[1]: Listening on libvirt QEMU daemon socket.
Oct 07 21:36:44 compute-0 systemd[1]: Starting libvirt QEMU daemon admin socket...
Oct 07 21:36:44 compute-0 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Oct 07 21:36:44 compute-0 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Oct 07 21:36:44 compute-0 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Oct 07 21:36:44 compute-0 systemd[1]: Started Virtual Machine and Container Registration Service.
Oct 07 21:36:44 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Oct 07 21:36:44 compute-0 systemd[1]: Started libvirt QEMU daemon.
Oct 07 21:36:44 compute-0 sudo[152676]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:45 compute-0 sudo[152890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvxqtmjwxukmlornysajgjswkolxmeim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873004.687224-2102-278070885307714/AnsiballZ_systemd.py'
Oct 07 21:36:45 compute-0 sudo[152890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:45 compute-0 python3.9[152892]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 07 21:36:45 compute-0 systemd[1]: Reloading.
Oct 07 21:36:45 compute-0 systemd-rc-local-generator[152918]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:36:45 compute-0 systemd-sysv-generator[152923]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:36:45 compute-0 systemd[1]: Starting libvirt secret daemon socket...
Oct 07 21:36:45 compute-0 systemd[1]: Listening on libvirt secret daemon socket.
Oct 07 21:36:45 compute-0 systemd[1]: Starting libvirt secret daemon admin socket...
Oct 07 21:36:45 compute-0 systemd[1]: Starting libvirt secret daemon read-only socket...
Oct 07 21:36:45 compute-0 systemd[1]: Listening on libvirt secret daemon admin socket.
Oct 07 21:36:45 compute-0 systemd[1]: Listening on libvirt secret daemon read-only socket.
Oct 07 21:36:45 compute-0 systemd[1]: Starting libvirt secret daemon...
Oct 07 21:36:45 compute-0 systemd[1]: Started libvirt secret daemon.
Oct 07 21:36:45 compute-0 sudo[152890]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:46 compute-0 sudo[153100]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhduvcaoyqeabruehfmroxylfiidzlhl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873006.1529956-2176-212803141415557/AnsiballZ_file.py'
Oct 07 21:36:46 compute-0 sudo[153100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:46 compute-0 python3.9[153102]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:36:46 compute-0 sudo[153100]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:47 compute-0 sudo[153252]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlwwekjsqgxjqskfyuwmbbicxmuxotsy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873006.9959805-2192-109956500447222/AnsiballZ_find.py'
Oct 07 21:36:47 compute-0 sudo[153252]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:47 compute-0 python3.9[153254]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct 07 21:36:47 compute-0 sudo[153252]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:48 compute-0 sudo[153404]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxzjpdztzxbmfgdtknrzipkvrkfhwulg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873008.1427858-2220-13009549321210/AnsiballZ_stat.py'
Oct 07 21:36:48 compute-0 sudo[153404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:48 compute-0 python3.9[153406]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:36:48 compute-0 sudo[153404]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:49 compute-0 sudo[153527]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sihzkmrfqzagdicyzoslslpauyceusnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873008.1427858-2220-13009549321210/AnsiballZ_copy.py'
Oct 07 21:36:49 compute-0 sudo[153527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:49 compute-0 python3.9[153529]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1759873008.1427858-2220-13009549321210/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:36:49 compute-0 sudo[153527]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:50 compute-0 sudo[153679]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lubgnbbfdiihbpiojprqqlgzyepxbyzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873009.7623184-2252-213438853080339/AnsiballZ_file.py'
Oct 07 21:36:50 compute-0 sudo[153679]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:50 compute-0 python3.9[153681]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:36:50 compute-0 sudo[153679]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:50 compute-0 sudo[153831]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bagjcjigahoigahlmsoyjekdqsxmzfte ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873010.6018565-2268-103293396209503/AnsiballZ_stat.py'
Oct 07 21:36:50 compute-0 sudo[153831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:51 compute-0 python3.9[153833]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:36:51 compute-0 sudo[153831]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:51 compute-0 sudo[153909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qfjmjbuhbiebvmbuqymoiijfkcwppnsx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873010.6018565-2268-103293396209503/AnsiballZ_file.py'
Oct 07 21:36:51 compute-0 sudo[153909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:51 compute-0 python3.9[153911]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:36:51 compute-0 sudo[153909]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:52 compute-0 sudo[154061]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvofgymbrrqbkahrmbkzacqzkxmcljcx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873011.9441192-2292-263116041435567/AnsiballZ_stat.py'
Oct 07 21:36:52 compute-0 sudo[154061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:52 compute-0 python3.9[154063]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:36:52 compute-0 sudo[154061]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:52 compute-0 sudo[154139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gcybyejngsutztsulianeduqostkdvbi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873011.9441192-2292-263116041435567/AnsiballZ_file.py'
Oct 07 21:36:52 compute-0 sudo[154139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:53 compute-0 python3.9[154141]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=._boo47ay recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:36:53 compute-0 sudo[154139]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:53 compute-0 sudo[154291]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phnnioinfgxrlvkzxhmrxzncnzwtthrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873013.3875632-2316-179190437700877/AnsiballZ_stat.py'
Oct 07 21:36:53 compute-0 sudo[154291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:53 compute-0 python3.9[154293]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:36:53 compute-0 sudo[154291]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:53 compute-0 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Oct 07 21:36:53 compute-0 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Consumed 1.050s CPU time.
Oct 07 21:36:54 compute-0 systemd[1]: setroubleshootd.service: Deactivated successfully.
Oct 07 21:36:54 compute-0 podman[154319]: 2025-10-07 21:36:54.124279274 +0000 UTC m=+0.115319804 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Oct 07 21:36:54 compute-0 sudo[154393]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwgaeamecbduitlzwwqapghxwzcebhny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873013.3875632-2316-179190437700877/AnsiballZ_file.py'
Oct 07 21:36:54 compute-0 sudo[154393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:54 compute-0 python3.9[154398]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:36:54 compute-0 sudo[154393]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:55 compute-0 sudo[154548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcbdrksxixffroaybdgfuutamlqbajng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873014.761939-2342-238062679098805/AnsiballZ_command.py'
Oct 07 21:36:55 compute-0 sudo[154548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:55 compute-0 python3.9[154550]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:36:55 compute-0 sudo[154548]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:56 compute-0 sudo[154701]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eyrcpcuhupnfvnyzeksrllhxsavlidgo ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759873015.6001098-2358-145711381726289/AnsiballZ_edpm_nftables_from_files.py'
Oct 07 21:36:56 compute-0 sudo[154701]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:56 compute-0 python3[154703]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct 07 21:36:56 compute-0 sudo[154701]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:56 compute-0 sudo[154854]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utadafvfxntehwvjepofjciawknrazjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873016.5459998-2374-18709612137677/AnsiballZ_stat.py'
Oct 07 21:36:56 compute-0 sudo[154854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:57 compute-0 python3.9[154856]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:36:57 compute-0 sudo[154854]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:57 compute-0 sudo[154932]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxnubbjlllerzkvvwrktveescmmkrckd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873016.5459998-2374-18709612137677/AnsiballZ_file.py'
Oct 07 21:36:57 compute-0 sudo[154932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:57 compute-0 python3.9[154934]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:36:57 compute-0 sudo[154932]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:58 compute-0 sudo[155085]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwgvuppmvilzoyqneldxiqippxnsvmfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873017.9137778-2398-173320060056637/AnsiballZ_stat.py'
Oct 07 21:36:58 compute-0 sudo[155085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:58 compute-0 python3.9[155087]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:36:58 compute-0 sudo[155085]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:58 compute-0 sudo[155163]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kkqcqoznaqimkshifqnxawasjdvnkgpc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873017.9137778-2398-173320060056637/AnsiballZ_file.py'
Oct 07 21:36:58 compute-0 sudo[155163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:59 compute-0 python3.9[155165]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:36:59 compute-0 sudo[155163]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:59 compute-0 sudo[155315]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpqiuybbhynqzatdgbqhbbryomlsoxyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873019.22554-2422-268231879125954/AnsiballZ_stat.py'
Oct 07 21:36:59 compute-0 sudo[155315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:36:59 compute-0 python3.9[155317]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:36:59 compute-0 sudo[155315]: pam_unix(sudo:session): session closed for user root
Oct 07 21:36:59 compute-0 podman[155318]: 2025-10-07 21:36:59.851998716 +0000 UTC m=+0.087972809 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest)
Oct 07 21:37:00 compute-0 sudo[155412]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzccnslcjmyshzibfzujrktjwypellsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873019.22554-2422-268231879125954/AnsiballZ_file.py'
Oct 07 21:37:00 compute-0 sudo[155412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:37:00 compute-0 python3.9[155414]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:37:00 compute-0 sudo[155412]: pam_unix(sudo:session): session closed for user root
Oct 07 21:37:00 compute-0 unix_chkpwd[155549]: password check failed for user (root)
Oct 07 21:37:00 compute-0 sshd-session[154803]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=27.79.44.171  user=root
Oct 07 21:37:00 compute-0 sudo[155565]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsgyjjtfcvpuaavcrwkvqqlnnwwjgbde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873020.5874639-2446-213029448083846/AnsiballZ_stat.py'
Oct 07 21:37:00 compute-0 sudo[155565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:37:01 compute-0 python3.9[155567]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:37:01 compute-0 sudo[155565]: pam_unix(sudo:session): session closed for user root
Oct 07 21:37:01 compute-0 sudo[155643]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzidamwftiotvfaqsexrozvdroxhahny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873020.5874639-2446-213029448083846/AnsiballZ_file.py'
Oct 07 21:37:01 compute-0 sudo[155643]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:37:01 compute-0 python3.9[155645]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:37:01 compute-0 sudo[155643]: pam_unix(sudo:session): session closed for user root
Oct 07 21:37:02 compute-0 sudo[155795]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfpzepobnmyrblilcaxgggqnvowunqxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873021.9914198-2470-103289759052284/AnsiballZ_stat.py'
Oct 07 21:37:02 compute-0 sudo[155795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:37:02 compute-0 python3.9[155797]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:37:02 compute-0 sudo[155795]: pam_unix(sudo:session): session closed for user root
Oct 07 21:37:02 compute-0 sshd-session[154803]: Failed password for root from 27.79.44.171 port 46850 ssh2
Oct 07 21:37:03 compute-0 sudo[155920]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-koxukairbprgxzqzqalezcuishaxacvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873021.9914198-2470-103289759052284/AnsiballZ_copy.py'
Oct 07 21:37:03 compute-0 sudo[155920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:37:03 compute-0 python3.9[155922]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759873021.9914198-2470-103289759052284/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:37:03 compute-0 sudo[155920]: pam_unix(sudo:session): session closed for user root
Oct 07 21:37:03 compute-0 sudo[156072]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fettbilzuoizqcmtcwpvefqotdkcbfao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873023.5821154-2500-175088343105211/AnsiballZ_file.py'
Oct 07 21:37:03 compute-0 sudo[156072]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:37:04 compute-0 python3.9[156074]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:37:04 compute-0 sudo[156072]: pam_unix(sudo:session): session closed for user root
Oct 07 21:37:04 compute-0 sudo[156224]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iorhxyqygikcmsdpswwfzssqawynnyqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873024.3545153-2516-132832978428795/AnsiballZ_command.py'
Oct 07 21:37:04 compute-0 sudo[156224]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:37:04 compute-0 python3.9[156226]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:37:04 compute-0 sudo[156224]: pam_unix(sudo:session): session closed for user root
Oct 07 21:37:05 compute-0 sudo[156379]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqjgempevxzbucwfaksqiedayymxugmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873025.180968-2532-76165399615662/AnsiballZ_blockinfile.py'
Oct 07 21:37:05 compute-0 sudo[156379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:37:05 compute-0 python3.9[156381]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:37:05 compute-0 sudo[156379]: pam_unix(sudo:session): session closed for user root
Oct 07 21:37:06 compute-0 sudo[156531]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxqhmltcpimidaosrujvnnrihkyzebok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873026.1713305-2550-52671317971650/AnsiballZ_command.py'
Oct 07 21:37:06 compute-0 sudo[156531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:37:06 compute-0 python3.9[156533]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:37:06 compute-0 sudo[156531]: pam_unix(sudo:session): session closed for user root
Oct 07 21:37:07 compute-0 sudo[156684]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkulkecvgvwwbsxxkflrmdelklsoqewx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873026.9575038-2566-130575066755135/AnsiballZ_stat.py'
Oct 07 21:37:07 compute-0 sudo[156684]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:37:07 compute-0 python3.9[156686]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 07 21:37:07 compute-0 sudo[156684]: pam_unix(sudo:session): session closed for user root
Oct 07 21:37:08 compute-0 sudo[156838]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkqegtwcddxsksudkclfyhyqmeydxboe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873027.842057-2582-82635332188288/AnsiballZ_command.py'
Oct 07 21:37:08 compute-0 sudo[156838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:37:08 compute-0 python3.9[156840]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:37:08 compute-0 sudo[156838]: pam_unix(sudo:session): session closed for user root
Oct 07 21:37:08 compute-0 sshd-session[154803]: Connection closed by authenticating user root 27.79.44.171 port 46850 [preauth]
Oct 07 21:37:09 compute-0 sudo[156994]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wglmoeegrrhnhesfvibinkarifibdrkk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873028.6544733-2598-96748078645300/AnsiballZ_file.py'
Oct 07 21:37:09 compute-0 sudo[156994]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:37:09 compute-0 python3.9[156996]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:37:09 compute-0 sudo[156994]: pam_unix(sudo:session): session closed for user root
Oct 07 21:37:09 compute-0 sudo[157146]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zolcxoyrsmiiseuqxexazikenxvffyzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873029.4476779-2614-201607102667490/AnsiballZ_stat.py'
Oct 07 21:37:09 compute-0 sudo[157146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:37:10 compute-0 python3.9[157148]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:37:10 compute-0 sudo[157146]: pam_unix(sudo:session): session closed for user root
Oct 07 21:37:10 compute-0 sudo[157269]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxeoqakwhghpcoquajcwzoasxpvpgnqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873029.4476779-2614-201607102667490/AnsiballZ_copy.py'
Oct 07 21:37:10 compute-0 sudo[157269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:37:10 compute-0 python3.9[157271]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759873029.4476779-2614-201607102667490/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:37:10 compute-0 sudo[157269]: pam_unix(sudo:session): session closed for user root
Oct 07 21:37:11 compute-0 sudo[157421]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjrhsksvaelmnniebqvmnniiwsnrsxaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873031.010376-2644-258830559882873/AnsiballZ_stat.py'
Oct 07 21:37:11 compute-0 sudo[157421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:37:11 compute-0 python3.9[157423]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:37:11 compute-0 sudo[157421]: pam_unix(sudo:session): session closed for user root
Oct 07 21:37:11 compute-0 sudo[157544]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nuokczyoqnygdsjzvjvmokgkigzzkexi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873031.010376-2644-258830559882873/AnsiballZ_copy.py'
Oct 07 21:37:11 compute-0 sudo[157544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:37:12 compute-0 python3.9[157546]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759873031.010376-2644-258830559882873/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:37:12 compute-0 sudo[157544]: pam_unix(sudo:session): session closed for user root
Oct 07 21:37:12 compute-0 sudo[157696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpkdpzteqcvavnuuywjzewxczpmlznpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873032.5162597-2674-139186266408428/AnsiballZ_stat.py'
Oct 07 21:37:12 compute-0 sudo[157696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:37:13 compute-0 python3.9[157698]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:37:13 compute-0 sudo[157696]: pam_unix(sudo:session): session closed for user root
Oct 07 21:37:13 compute-0 sudo[157819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dstzhakrxxbmuyjtwkuqorvfhvizvlys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873032.5162597-2674-139186266408428/AnsiballZ_copy.py'
Oct 07 21:37:13 compute-0 sudo[157819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:37:13 compute-0 python3.9[157821]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759873032.5162597-2674-139186266408428/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:37:13 compute-0 sudo[157819]: pam_unix(sudo:session): session closed for user root
Oct 07 21:37:14 compute-0 sudo[157971]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrovkkscarjpfqksxnkitayhnxezmaad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873033.9818573-2704-28828565634910/AnsiballZ_systemd.py'
Oct 07 21:37:14 compute-0 sudo[157971]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:37:14 compute-0 python3.9[157973]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 07 21:37:14 compute-0 systemd[1]: Reloading.
Oct 07 21:37:14 compute-0 systemd-rc-local-generator[158000]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:37:14 compute-0 systemd-sysv-generator[158005]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:37:14 compute-0 systemd[1]: Reached target edpm_libvirt.target.
Oct 07 21:37:14 compute-0 sudo[157971]: pam_unix(sudo:session): session closed for user root
Oct 07 21:37:15 compute-0 sudo[158161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-igynuzphdrbtzvfsqztgnfglmabwxshw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873035.3431451-2720-235162208831032/AnsiballZ_systemd.py'
Oct 07 21:37:15 compute-0 sudo[158161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:37:15 compute-0 python3.9[158163]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Oct 07 21:37:15 compute-0 systemd[1]: Reloading.
Oct 07 21:37:16 compute-0 systemd-rc-local-generator[158186]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:37:16 compute-0 systemd-sysv-generator[158191]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:37:16 compute-0 systemd[1]: Reloading.
Oct 07 21:37:16 compute-0 systemd-rc-local-generator[158227]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:37:16 compute-0 systemd-sysv-generator[158230]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:37:16 compute-0 sudo[158161]: pam_unix(sudo:session): session closed for user root
Oct 07 21:37:17 compute-0 sshd-session[103913]: Connection closed by 192.168.122.30 port 59580
Oct 07 21:37:17 compute-0 sshd-session[103910]: pam_unix(sshd:session): session closed for user zuul
Oct 07 21:37:17 compute-0 systemd[1]: session-24.scope: Deactivated successfully.
Oct 07 21:37:17 compute-0 systemd[1]: session-24.scope: Consumed 3min 45.183s CPU time.
Oct 07 21:37:17 compute-0 systemd-logind[798]: Session 24 logged out. Waiting for processes to exit.
Oct 07 21:37:17 compute-0 systemd-logind[798]: Removed session 24.
Oct 07 21:37:22 compute-0 sshd-session[158259]: Accepted publickey for zuul from 192.168.122.30 port 43450 ssh2: ECDSA SHA256:OH28aSFFDwSUyue/q6XYLqn3ZIRCwAhLEh6x3UJ/Ca0
Oct 07 21:37:22 compute-0 systemd-logind[798]: New session 25 of user zuul.
Oct 07 21:37:22 compute-0 systemd[1]: Started Session 25 of User zuul.
Oct 07 21:37:22 compute-0 sshd-session[158259]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 07 21:37:23 compute-0 python3.9[158414]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 07 21:37:23 compute-0 sshd-session[158304]: Invalid user support from 116.110.151.5 port 60094
Oct 07 21:37:24 compute-0 sshd-session[158304]: pam_unix(sshd:auth): check pass; user unknown
Oct 07 21:37:24 compute-0 sshd-session[158304]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=116.110.151.5
Oct 07 21:37:24 compute-0 podman[158497]: 2025-10-07 21:37:24.48380532 +0000 UTC m=+0.141666547 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Oct 07 21:37:24 compute-0 sudo[158597]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uklknleenaiumekrnqflvmkhbctvlcaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873044.0370822-48-163255211467882/AnsiballZ_file.py'
Oct 07 21:37:24 compute-0 sudo[158597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:37:24 compute-0 python3.9[158599]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:37:24 compute-0 sudo[158597]: pam_unix(sudo:session): session closed for user root
Oct 07 21:37:25 compute-0 sudo[158749]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmidgeuninwqjbtqgvurlwaytwctvhzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873044.9633284-48-130189100540729/AnsiballZ_file.py'
Oct 07 21:37:25 compute-0 sudo[158749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:37:25 compute-0 python3.9[158751]: ansible-ansible.builtin.file Invoked with path=/etc/target setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:37:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:37:25.571 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:37:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:37:25.573 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:37:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:37:25.573 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:37:25 compute-0 sudo[158749]: pam_unix(sudo:session): session closed for user root
Oct 07 21:37:26 compute-0 sudo[158902]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujjvssobsgojngbuuekxoifdzbikklqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873045.7179232-48-63504657938796/AnsiballZ_file.py'
Oct 07 21:37:26 compute-0 sudo[158902]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:37:26 compute-0 python3.9[158904]: ansible-ansible.builtin.file Invoked with path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:37:26 compute-0 sudo[158902]: pam_unix(sudo:session): session closed for user root
Oct 07 21:37:26 compute-0 sshd-session[158304]: Failed password for invalid user support from 116.110.151.5 port 60094 ssh2
Oct 07 21:37:26 compute-0 sudo[159054]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znjmahbrwuwmjsxmxpwvioynloiaatuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873046.3548737-48-255652080226880/AnsiballZ_file.py'
Oct 07 21:37:26 compute-0 sudo[159054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:37:26 compute-0 python3.9[159056]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/config-data selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct 07 21:37:26 compute-0 sudo[159054]: pam_unix(sudo:session): session closed for user root
Oct 07 21:37:27 compute-0 sudo[159206]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zivrialepaszhpkpvvjtwvooijgsvhsd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873047.191068-48-156761716160412/AnsiballZ_file.py'
Oct 07 21:37:27 compute-0 sudo[159206]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:37:27 compute-0 python3.9[159208]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/config-data/ansible-generated/iscsid setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:37:27 compute-0 sudo[159206]: pam_unix(sudo:session): session closed for user root
Oct 07 21:37:27 compute-0 sshd-session[158304]: Connection closed by invalid user support 116.110.151.5 port 60094 [preauth]
Oct 07 21:37:28 compute-0 sudo[159358]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ooizuesisyxmwpuubwnslzsgldxqbzbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873048.0206134-120-199396083776069/AnsiballZ_stat.py'
Oct 07 21:37:28 compute-0 sudo[159358]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:37:28 compute-0 python3.9[159360]: ansible-ansible.builtin.stat Invoked with path=/lib/systemd/system/iscsid.socket follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 07 21:37:28 compute-0 sudo[159358]: pam_unix(sudo:session): session closed for user root
Oct 07 21:37:29 compute-0 sudo[159512]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwywmxmcmiyzlanfdnucaymhkzxrmeax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873049.058295-136-67501409952683/AnsiballZ_systemd.py'
Oct 07 21:37:29 compute-0 sudo[159512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:37:30 compute-0 python3.9[159514]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iscsid.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 07 21:37:30 compute-0 systemd[1]: Reloading.
Oct 07 21:37:30 compute-0 podman[159516]: 2025-10-07 21:37:30.172137984 +0000 UTC m=+0.060688930 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Oct 07 21:37:30 compute-0 systemd-rc-local-generator[159562]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:37:30 compute-0 systemd-sysv-generator[159565]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:37:30 compute-0 sudo[159512]: pam_unix(sudo:session): session closed for user root
Oct 07 21:37:31 compute-0 sudo[159720]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cuulywqapyhdavzijdoixfkpxssqfgwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873050.7056818-152-98040222042082/AnsiballZ_service_facts.py'
Oct 07 21:37:31 compute-0 sudo[159720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:37:31 compute-0 python3.9[159722]: ansible-ansible.builtin.service_facts Invoked
Oct 07 21:37:31 compute-0 network[159739]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 07 21:37:31 compute-0 network[159740]: 'network-scripts' will be removed from distribution in near future.
Oct 07 21:37:31 compute-0 network[159741]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 07 21:37:35 compute-0 sudo[159720]: pam_unix(sudo:session): session closed for user root
Oct 07 21:37:36 compute-0 sudo[160012]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nyywvythufupgamidsgmbdsgsqoicosd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873056.6194553-168-139514823146003/AnsiballZ_systemd.py'
Oct 07 21:37:37 compute-0 sudo[160012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:37:37 compute-0 python3.9[160014]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iscsi-starter.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 07 21:37:37 compute-0 systemd[1]: Reloading.
Oct 07 21:37:37 compute-0 systemd-rc-local-generator[160042]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:37:37 compute-0 systemd-sysv-generator[160045]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:37:37 compute-0 sudo[160012]: pam_unix(sudo:session): session closed for user root
Oct 07 21:37:38 compute-0 python3.9[160202]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 07 21:37:39 compute-0 sudo[160352]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgmpjvwtzjnmrmcahyoonqsxmbkswmva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873058.8781407-202-241199259187771/AnsiballZ_podman_container.py'
Oct 07 21:37:39 compute-0 sudo[160352]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:37:39 compute-0 python3.9[160354]: ansible-containers.podman.podman_container Invoked with command=/usr/sbin/iscsi-iname detach=False image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest name=iscsid_config rm=True tty=True executable=podman state=started debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Oct 07 21:37:39 compute-0 podman[160389]: 2025-10-07 21:37:39.968548503 +0000 UTC m=+0.071238931 container create f909bf6068e69dc531251305beca667831b3d4ec591f8e5c262705d170cca412 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid_config, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Oct 07 21:37:39 compute-0 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 07 21:37:39 compute-0 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 07 21:37:39 compute-0 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 07 21:37:39 compute-0 NetworkManager[51722]: <info>  [1759873059.9996] manager: (podman0): new Bridge device (/org/freedesktop/NetworkManager/Devices/20)
Oct 07 21:37:40 compute-0 kernel: podman0: port 1(veth0) entered blocking state
Oct 07 21:37:40 compute-0 kernel: podman0: port 1(veth0) entered disabled state
Oct 07 21:37:40 compute-0 kernel: veth0: entered allmulticast mode
Oct 07 21:37:40 compute-0 kernel: veth0: entered promiscuous mode
Oct 07 21:37:40 compute-0 kernel: podman0: port 1(veth0) entered blocking state
Oct 07 21:37:40 compute-0 kernel: podman0: port 1(veth0) entered forwarding state
Oct 07 21:37:40 compute-0 NetworkManager[51722]: <info>  [1759873060.0235] device (veth0): carrier: link connected
Oct 07 21:37:40 compute-0 NetworkManager[51722]: <info>  [1759873060.0241] manager: (veth0): new Veth device (/org/freedesktop/NetworkManager/Devices/21)
Oct 07 21:37:40 compute-0 NetworkManager[51722]: <info>  [1759873060.0248] device (podman0): carrier: link connected
Oct 07 21:37:40 compute-0 podman[160389]: 2025-10-07 21:37:39.938461017 +0000 UTC m=+0.041151495 image pull 2c0acbe8b07baed3b27d0202cd594c4edfd15616d3c28ad8374e80ebca74a2a1 38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest
Oct 07 21:37:40 compute-0 systemd-udevd[160418]: Network interface NamePolicy= disabled on kernel command line.
Oct 07 21:37:40 compute-0 systemd-udevd[160421]: Network interface NamePolicy= disabled on kernel command line.
Oct 07 21:37:40 compute-0 NetworkManager[51722]: <info>  [1759873060.0701] device (podman0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 07 21:37:40 compute-0 NetworkManager[51722]: <info>  [1759873060.0723] device (podman0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Oct 07 21:37:40 compute-0 NetworkManager[51722]: <info>  [1759873060.0737] device (podman0): Activation: starting connection 'podman0' (c9f9e17b-171a-4457-8873-86b5c34c7078)
Oct 07 21:37:40 compute-0 NetworkManager[51722]: <info>  [1759873060.0742] device (podman0): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Oct 07 21:37:40 compute-0 NetworkManager[51722]: <info>  [1759873060.0746] device (podman0): state change: prepare -> config (reason 'none', managed-type: 'external')
Oct 07 21:37:40 compute-0 NetworkManager[51722]: <info>  [1759873060.0751] device (podman0): state change: config -> ip-config (reason 'none', managed-type: 'external')
Oct 07 21:37:40 compute-0 NetworkManager[51722]: <info>  [1759873060.0754] device (podman0): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Oct 07 21:37:40 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct 07 21:37:40 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct 07 21:37:40 compute-0 NetworkManager[51722]: <info>  [1759873060.1144] device (podman0): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Oct 07 21:37:40 compute-0 NetworkManager[51722]: <info>  [1759873060.1147] device (podman0): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Oct 07 21:37:40 compute-0 NetworkManager[51722]: <info>  [1759873060.1156] device (podman0): Activation: successful, device activated.
Oct 07 21:37:40 compute-0 systemd[1]: iscsi.service: Unit cannot be reloaded because it is inactive.
Oct 07 21:37:40 compute-0 systemd[1]: Started libpod-conmon-f909bf6068e69dc531251305beca667831b3d4ec591f8e5c262705d170cca412.scope.
Oct 07 21:37:40 compute-0 systemd[1]: Started libcrun container.
Oct 07 21:37:40 compute-0 podman[160389]: 2025-10-07 21:37:40.426310641 +0000 UTC m=+0.529001109 container init f909bf6068e69dc531251305beca667831b3d4ec591f8e5c262705d170cca412 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid_config, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251007, tcib_managed=true)
Oct 07 21:37:40 compute-0 podman[160389]: 2025-10-07 21:37:40.437195813 +0000 UTC m=+0.539886231 container start f909bf6068e69dc531251305beca667831b3d4ec591f8e5c262705d170cca412 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid_config, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true)
Oct 07 21:37:40 compute-0 podman[160389]: 2025-10-07 21:37:40.441070381 +0000 UTC m=+0.543760809 container attach f909bf6068e69dc531251305beca667831b3d4ec591f8e5c262705d170cca412 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid_config, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251007, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Oct 07 21:37:40 compute-0 iscsid_config[160546]: iqn.1994-05.com.redhat:26d7b28cfef
Oct 07 21:37:40 compute-0 systemd[1]: libpod-f909bf6068e69dc531251305beca667831b3d4ec591f8e5c262705d170cca412.scope: Deactivated successfully.
Oct 07 21:37:40 compute-0 podman[160389]: 2025-10-07 21:37:40.444504106 +0000 UTC m=+0.547194534 container died f909bf6068e69dc531251305beca667831b3d4ec591f8e5c262705d170cca412 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid_config, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Oct 07 21:37:40 compute-0 kernel: podman0: port 1(veth0) entered disabled state
Oct 07 21:37:40 compute-0 kernel: veth0 (unregistering): left allmulticast mode
Oct 07 21:37:40 compute-0 kernel: veth0 (unregistering): left promiscuous mode
Oct 07 21:37:40 compute-0 kernel: podman0: port 1(veth0) entered disabled state
Oct 07 21:37:40 compute-0 NetworkManager[51722]: <info>  [1759873060.5212] device (podman0): state change: activated -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 07 21:37:40 compute-0 systemd[1]: run-netns-netns\x2dc284347d\x2dc15a\x2d661e\x2dfaf9\x2dd74b9514d80d.mount: Deactivated successfully.
Oct 07 21:37:40 compute-0 podman[160389]: 2025-10-07 21:37:40.941100646 +0000 UTC m=+1.043791074 container remove f909bf6068e69dc531251305beca667831b3d4ec591f8e5c262705d170cca412 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid_config, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct 07 21:37:40 compute-0 systemd[1]: libpod-conmon-f909bf6068e69dc531251305beca667831b3d4ec591f8e5c262705d170cca412.scope: Deactivated successfully.
Oct 07 21:37:40 compute-0 python3.9[160354]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman run --name iscsid_config --detach=False --rm --tty=True 38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest /usr/sbin/iscsi-iname
Oct 07 21:37:40 compute-0 systemd[1]: var-lib-containers-storage-overlay-d051a792038526df8c7d642806423bb3826f3d20ed3a66877e033ba2d2a242d8-merged.mount: Deactivated successfully.
Oct 07 21:37:40 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f909bf6068e69dc531251305beca667831b3d4ec591f8e5c262705d170cca412-userdata-shm.mount: Deactivated successfully.
Oct 07 21:37:41 compute-0 python3.9[160354]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: Error generating systemd: 
                                             DEPRECATED command:
                                             It is recommended to use Quadlets for running containers and pods under systemd.
                                             
                                             Please refer to podman-systemd.unit(5) for details.
                                             Error: iscsid_config does not refer to a container or pod: no pod with name or ID iscsid_config found: no such pod: no container with name or ID "iscsid_config" found: no such container
Oct 07 21:37:41 compute-0 sudo[160352]: pam_unix(sudo:session): session closed for user root
Oct 07 21:37:41 compute-0 sudo[160787]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvljxzgpvdikdpgdxbxkoojfynlmptky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873061.3337862-218-214615748774452/AnsiballZ_stat.py'
Oct 07 21:37:41 compute-0 sudo[160787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:37:41 compute-0 python3.9[160789]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:37:41 compute-0 sudo[160787]: pam_unix(sudo:session): session closed for user root
Oct 07 21:37:42 compute-0 sudo[160910]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbqpfhpcbucwqkhiiagdgsefnqbyvgqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873061.3337862-218-214615748774452/AnsiballZ_copy.py'
Oct 07 21:37:42 compute-0 sudo[160910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:37:42 compute-0 python3.9[160912]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759873061.3337862-218-214615748774452/.source.iscsi _original_basename=.xpk0oj3s follow=False checksum=69a4b0f25ad3ff76b2c5d59fa8e21fcb19f5c8b8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:37:42 compute-0 sudo[160910]: pam_unix(sudo:session): session closed for user root
Oct 07 21:37:43 compute-0 sudo[161062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jtuegjvmpfxdrkrkozqffjlhgcpjvfew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873063.0215852-248-192526497509526/AnsiballZ_file.py'
Oct 07 21:37:43 compute-0 sudo[161062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:37:43 compute-0 python3.9[161064]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:37:43 compute-0 sudo[161062]: pam_unix(sudo:session): session closed for user root
Oct 07 21:37:44 compute-0 python3.9[161214]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/iscsid.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 07 21:37:45 compute-0 sudo[161366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adigsogrsbujwpufpujxxapfzesheswz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873064.7354407-282-174030348475599/AnsiballZ_lineinfile.py'
Oct 07 21:37:45 compute-0 sudo[161366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:37:45 compute-0 python3.9[161368]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:37:45 compute-0 sudo[161366]: pam_unix(sudo:session): session closed for user root
Oct 07 21:37:46 compute-0 sudo[161518]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbdpnnccdzhcvcutsjrpjhafxvvltpuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873065.7342513-300-63760246239937/AnsiballZ_file.py'
Oct 07 21:37:46 compute-0 sudo[161518]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:37:46 compute-0 python3.9[161520]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:37:46 compute-0 sudo[161518]: pam_unix(sudo:session): session closed for user root
Oct 07 21:37:46 compute-0 sudo[161670]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydrgrlmgbuojhbqnohaxdzsaiidpshne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873066.506099-316-95840124130901/AnsiballZ_stat.py'
Oct 07 21:37:46 compute-0 sudo[161670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:37:47 compute-0 python3.9[161672]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:37:47 compute-0 sudo[161670]: pam_unix(sudo:session): session closed for user root
Oct 07 21:37:47 compute-0 sudo[161748]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqjlkdycsxwghouiqabbdhtkreghwgak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873066.506099-316-95840124130901/AnsiballZ_file.py'
Oct 07 21:37:47 compute-0 sudo[161748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:37:47 compute-0 python3.9[161750]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:37:47 compute-0 sudo[161748]: pam_unix(sudo:session): session closed for user root
Oct 07 21:37:47 compute-0 sudo[161900]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzdqoxlfkxnrgptwazilqakvgohjlkoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873067.6980476-316-236089240387267/AnsiballZ_stat.py'
Oct 07 21:37:47 compute-0 sudo[161900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:37:48 compute-0 python3.9[161902]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:37:48 compute-0 sudo[161900]: pam_unix(sudo:session): session closed for user root
Oct 07 21:37:48 compute-0 sudo[161978]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-snsooxkprajeeifqgdnnepcajzbsjltc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873067.6980476-316-236089240387267/AnsiballZ_file.py'
Oct 07 21:37:48 compute-0 sudo[161978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:37:48 compute-0 python3.9[161980]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:37:48 compute-0 sudo[161978]: pam_unix(sudo:session): session closed for user root
Oct 07 21:37:49 compute-0 sudo[162130]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phibncjcplggigjysxbpgzrogijicqzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873069.0842917-362-201270459893774/AnsiballZ_file.py'
Oct 07 21:37:49 compute-0 sudo[162130]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:37:49 compute-0 python3.9[162132]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:37:49 compute-0 sudo[162130]: pam_unix(sudo:session): session closed for user root
Oct 07 21:37:50 compute-0 sudo[162282]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-maguzufzbtbtqlrsbjgcqzjfdggdpnaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873069.916238-378-96571787222657/AnsiballZ_stat.py'
Oct 07 21:37:50 compute-0 sudo[162282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:37:50 compute-0 python3.9[162284]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:37:50 compute-0 sudo[162282]: pam_unix(sudo:session): session closed for user root
Oct 07 21:37:50 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct 07 21:37:50 compute-0 sudo[162360]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aomedugylbkcvcrfrlarstvookncrilx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873069.916238-378-96571787222657/AnsiballZ_file.py'
Oct 07 21:37:50 compute-0 sudo[162360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:37:50 compute-0 python3.9[162362]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:37:51 compute-0 sudo[162360]: pam_unix(sudo:session): session closed for user root
Oct 07 21:37:51 compute-0 sudo[162512]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umbikmklmeysmnbygfajbwmlopoctocq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873071.2560666-402-189944982590131/AnsiballZ_stat.py'
Oct 07 21:37:51 compute-0 sudo[162512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:37:51 compute-0 python3.9[162514]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:37:51 compute-0 sudo[162512]: pam_unix(sudo:session): session closed for user root
Oct 07 21:37:52 compute-0 sudo[162590]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qavqssgafwhwvqedmfkkfupszekrpaag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873071.2560666-402-189944982590131/AnsiballZ_file.py'
Oct 07 21:37:52 compute-0 sudo[162590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:37:52 compute-0 python3.9[162592]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:37:52 compute-0 sudo[162590]: pam_unix(sudo:session): session closed for user root
Oct 07 21:37:53 compute-0 sudo[162742]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvnvscwlgbobzvugzkhtherurcmmihri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873072.6475024-426-154138616451118/AnsiballZ_systemd.py'
Oct 07 21:37:53 compute-0 sudo[162742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:37:53 compute-0 python3.9[162744]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 07 21:37:53 compute-0 systemd[1]: Reloading.
Oct 07 21:37:53 compute-0 systemd-rc-local-generator[162765]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:37:53 compute-0 systemd-sysv-generator[162771]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:37:53 compute-0 sudo[162742]: pam_unix(sudo:session): session closed for user root
Oct 07 21:37:54 compute-0 sudo[162931]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-srpyfcxncnjgkighdimiydkeiqvvyney ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873073.978424-442-77374373655987/AnsiballZ_stat.py'
Oct 07 21:37:54 compute-0 sudo[162931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:37:54 compute-0 python3.9[162933]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:37:54 compute-0 sudo[162931]: pam_unix(sudo:session): session closed for user root
Oct 07 21:37:54 compute-0 podman[162959]: 2025-10-07 21:37:54.923198673 +0000 UTC m=+0.146709321 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251007, config_id=ovn_controller, io.buildah.version=1.41.4)
Oct 07 21:37:54 compute-0 sudo[163035]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dafjzuieompqsfaknmkvfjdobagngkog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873073.978424-442-77374373655987/AnsiballZ_file.py'
Oct 07 21:37:54 compute-0 sudo[163035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:37:55 compute-0 python3.9[163038]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:37:55 compute-0 sudo[163035]: pam_unix(sudo:session): session closed for user root
Oct 07 21:37:55 compute-0 sudo[163188]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncwmlqecsbwriotpjrdhwhhupwejvtie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873075.3180366-466-137178469162094/AnsiballZ_stat.py'
Oct 07 21:37:55 compute-0 sudo[163188]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:37:55 compute-0 python3.9[163190]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:37:55 compute-0 sudo[163188]: pam_unix(sudo:session): session closed for user root
Oct 07 21:37:56 compute-0 sudo[163266]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxhgnincbnxxkwpbgjmkswteoagcqnhu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873075.3180366-466-137178469162094/AnsiballZ_file.py'
Oct 07 21:37:56 compute-0 sudo[163266]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:37:56 compute-0 python3.9[163268]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:37:56 compute-0 sudo[163266]: pam_unix(sudo:session): session closed for user root
Oct 07 21:37:57 compute-0 sudo[163418]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vnmbxpwvjtfcxemdgmoqqupshfuuphyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873076.7081852-490-211985599640492/AnsiballZ_systemd.py'
Oct 07 21:37:57 compute-0 sudo[163418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:37:57 compute-0 python3.9[163420]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 07 21:37:57 compute-0 systemd[1]: Reloading.
Oct 07 21:37:57 compute-0 systemd-rc-local-generator[163444]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:37:57 compute-0 systemd-sysv-generator[163449]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:37:57 compute-0 systemd[1]: Starting Create netns directory...
Oct 07 21:37:57 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 07 21:37:57 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 07 21:37:57 compute-0 systemd[1]: Finished Create netns directory.
Oct 07 21:37:57 compute-0 sudo[163418]: pam_unix(sudo:session): session closed for user root
Oct 07 21:37:58 compute-0 sudo[163613]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqmqrbkxrsjzhaxqpcwpunfsshchaayi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873078.3097987-510-27797778929505/AnsiballZ_file.py'
Oct 07 21:37:58 compute-0 sudo[163613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:37:58 compute-0 python3.9[163615]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:37:58 compute-0 sudo[163613]: pam_unix(sudo:session): session closed for user root
Oct 07 21:37:59 compute-0 unix_chkpwd[163715]: password check failed for user (sync)
Oct 07 21:37:59 compute-0 sshd-session[163461]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=116.110.151.5  user=sync
Oct 07 21:37:59 compute-0 sudo[163766]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftjhiqeddqgbvsxcigphejvamotxtznh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873079.159384-526-133095701086067/AnsiballZ_stat.py'
Oct 07 21:37:59 compute-0 sudo[163766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:37:59 compute-0 python3.9[163768]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/iscsid/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:37:59 compute-0 sudo[163766]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:00 compute-0 sudo[163889]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbvqohwwoxzmkfmyelifijezdzjvllif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873079.159384-526-133095701086067/AnsiballZ_copy.py'
Oct 07 21:38:00 compute-0 sudo[163889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:00 compute-0 python3.9[163891]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/iscsid/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759873079.159384-526-133095701086067/.source _original_basename=healthcheck follow=False checksum=2e1237e7fe015c809b173c52e24cfb87132f4344 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:38:00 compute-0 sudo[163889]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:00 compute-0 podman[163916]: 2025-10-07 21:38:00.848093685 +0000 UTC m=+0.079540875 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251007)
Oct 07 21:38:01 compute-0 sshd-session[163461]: Failed password for sync from 116.110.151.5 port 54744 ssh2
Oct 07 21:38:01 compute-0 sudo[164060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zeurydikrbucfvvtepvxtlileroaohpm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873080.9857893-560-68616460703379/AnsiballZ_file.py'
Oct 07 21:38:01 compute-0 sudo[164060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:01 compute-0 python3.9[164062]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:38:01 compute-0 sudo[164060]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:02 compute-0 sshd-session[163461]: Connection closed by authenticating user sync 116.110.151.5 port 54744 [preauth]
Oct 07 21:38:02 compute-0 sudo[164212]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylsqqrlyfxfjspvsjohnsgvbrjpafpok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873081.8578877-576-87997108869614/AnsiballZ_stat.py'
Oct 07 21:38:02 compute-0 sudo[164212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:02 compute-0 python3.9[164214]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/iscsid.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:38:02 compute-0 sudo[164212]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:02 compute-0 sudo[164335]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hpgwzprcrswhpddcbfpmpdobjeicunfe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873081.8578877-576-87997108869614/AnsiballZ_copy.py'
Oct 07 21:38:02 compute-0 sudo[164335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:03 compute-0 python3.9[164337]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/iscsid.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759873081.8578877-576-87997108869614/.source.json _original_basename=.umupq9yn follow=False checksum=80e4f97460718c7e5c66b21ef8b846eba0e0dbc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:38:03 compute-0 sudo[164335]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:03 compute-0 sudo[164487]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djtodqfuyirogpzsmxtvkckghguwaeky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873083.3797753-606-260572821876307/AnsiballZ_file.py'
Oct 07 21:38:03 compute-0 sudo[164487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:03 compute-0 python3.9[164489]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/iscsid state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:38:03 compute-0 sudo[164487]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:04 compute-0 sudo[164639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-citagkalilokqznhfizltfuxgfqcxeip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873084.3340747-622-108792338873671/AnsiballZ_stat.py'
Oct 07 21:38:04 compute-0 sudo[164639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:04 compute-0 sudo[164639]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:05 compute-0 sudo[164762]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yehqupizezzhbnthjucludoyxcmqcgno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873084.3340747-622-108792338873671/AnsiballZ_copy.py'
Oct 07 21:38:05 compute-0 sudo[164762]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:05 compute-0 sudo[164762]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:06 compute-0 sudo[164914]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vimmncdwzmncmvuffvjgkrqlgfjsiddc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873086.0443525-656-196373190624744/AnsiballZ_container_config_data.py'
Oct 07 21:38:06 compute-0 sudo[164914]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:06 compute-0 python3.9[164916]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/iscsid config_pattern=*.json debug=False
Oct 07 21:38:06 compute-0 sudo[164914]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:07 compute-0 sudo[165066]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vujxtrebotevplxksysotqgmcpjeyada ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873087.1004293-674-38711452294109/AnsiballZ_container_config_hash.py'
Oct 07 21:38:07 compute-0 sudo[165066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:07 compute-0 python3.9[165068]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 07 21:38:07 compute-0 sudo[165066]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:08 compute-0 sudo[165218]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zastuiaacrguevaixkgleiqucsyvkgqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873088.1823328-692-125465964363389/AnsiballZ_podman_container_info.py'
Oct 07 21:38:08 compute-0 sudo[165218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:08 compute-0 python3.9[165220]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct 07 21:38:09 compute-0 sudo[165218]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:10 compute-0 sudo[165396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orabumtahchzhlgeeoixkfuxiazjmhqw ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759873089.9038568-718-88962016311284/AnsiballZ_edpm_container_manage.py'
Oct 07 21:38:10 compute-0 sudo[165396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:10 compute-0 python3[165398]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/iscsid config_id=iscsid config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct 07 21:38:11 compute-0 podman[165434]: 2025-10-07 21:38:11.079998461 +0000 UTC m=+0.071997067 container create bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, container_name=iscsid)
Oct 07 21:38:11 compute-0 podman[165434]: 2025-10-07 21:38:11.045664548 +0000 UTC m=+0.037663194 image pull 2c0acbe8b07baed3b27d0202cd594c4edfd15616d3c28ad8374e80ebca74a2a1 38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest
Oct 07 21:38:11 compute-0 python3[165398]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name iscsid --conmon-pidfile /run/iscsid.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=iscsid --label container_name=iscsid --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run:/run --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:z --volume /etc/target:/etc/target:z --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /var/lib/openstack/healthchecks/iscsid:/openstack:ro,z 38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest
Oct 07 21:38:11 compute-0 sudo[165396]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:11 compute-0 sudo[165621]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkklpqelbtlnffjqtnarshnngakciker ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873091.459689-734-52401681692451/AnsiballZ_stat.py'
Oct 07 21:38:11 compute-0 sudo[165621]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:11 compute-0 python3.9[165623]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 07 21:38:12 compute-0 sudo[165621]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:12 compute-0 sudo[165775]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxyjdfeqjqjoqzymqyfaoqfzjogjcsns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873092.4716175-752-77597357541209/AnsiballZ_file.py'
Oct 07 21:38:12 compute-0 sudo[165775]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:13 compute-0 python3.9[165777]: ansible-file Invoked with path=/etc/systemd/system/edpm_iscsid.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:38:13 compute-0 sudo[165775]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:13 compute-0 sudo[165851]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjgxlvkmujcdubosormcxpodccelmeli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873092.4716175-752-77597357541209/AnsiballZ_stat.py'
Oct 07 21:38:13 compute-0 sudo[165851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:13 compute-0 python3.9[165853]: ansible-stat Invoked with path=/etc/systemd/system/edpm_iscsid_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 07 21:38:13 compute-0 sudo[165851]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:14 compute-0 sudo[166002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsgwttpnwgwvuqzhyjzfplnehsikyace ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873093.510378-752-96054226780164/AnsiballZ_copy.py'
Oct 07 21:38:14 compute-0 sudo[166002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:14 compute-0 python3.9[166004]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759873093.510378-752-96054226780164/source dest=/etc/systemd/system/edpm_iscsid.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:38:14 compute-0 sudo[166002]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:14 compute-0 sudo[166078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kadmeobizimzitcztwqdcyvtommrrgfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873093.510378-752-96054226780164/AnsiballZ_systemd.py'
Oct 07 21:38:14 compute-0 sudo[166078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:15 compute-0 python3.9[166080]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 07 21:38:15 compute-0 systemd[1]: Reloading.
Oct 07 21:38:15 compute-0 systemd-sysv-generator[166113]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:38:15 compute-0 systemd-rc-local-generator[166109]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:38:15 compute-0 sudo[166078]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:15 compute-0 sudo[166189]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhtcfiazebwlgegaefmewstuchvaqyed ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873093.510378-752-96054226780164/AnsiballZ_systemd.py'
Oct 07 21:38:15 compute-0 sudo[166189]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:16 compute-0 python3.9[166191]: ansible-systemd Invoked with state=restarted name=edpm_iscsid.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 07 21:38:16 compute-0 systemd[1]: Reloading.
Oct 07 21:38:16 compute-0 systemd-sysv-generator[166219]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:38:16 compute-0 systemd-rc-local-generator[166215]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:38:16 compute-0 systemd[1]: Starting iscsid container...
Oct 07 21:38:16 compute-0 systemd[1]: Started libcrun container.
Oct 07 21:38:16 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9db5240783c9462b9a9dc27aa3a2a787b1ac64fb8b082af1a50b96bcb1857cae/merged/etc/target supports timestamps until 2038 (0x7fffffff)
Oct 07 21:38:16 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9db5240783c9462b9a9dc27aa3a2a787b1ac64fb8b082af1a50b96bcb1857cae/merged/etc/iscsi supports timestamps until 2038 (0x7fffffff)
Oct 07 21:38:16 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9db5240783c9462b9a9dc27aa3a2a787b1ac64fb8b082af1a50b96bcb1857cae/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct 07 21:38:16 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71.
Oct 07 21:38:16 compute-0 podman[166231]: 2025-10-07 21:38:16.71901175 +0000 UTC m=+0.211709759 container init bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid)
Oct 07 21:38:16 compute-0 iscsid[166246]: + sudo -E kolla_set_configs
Oct 07 21:38:16 compute-0 sudo[166252]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Oct 07 21:38:16 compute-0 podman[166231]: 2025-10-07 21:38:16.768939938 +0000 UTC m=+0.261637937 container start bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct 07 21:38:16 compute-0 podman[166231]: iscsid
Oct 07 21:38:16 compute-0 systemd[1]: Started iscsid container.
Oct 07 21:38:16 compute-0 systemd[1]: Created slice User Slice of UID 0.
Oct 07 21:38:16 compute-0 systemd[1]: Starting User Runtime Directory /run/user/0...
Oct 07 21:38:16 compute-0 sudo[166189]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:16 compute-0 systemd[1]: Finished User Runtime Directory /run/user/0.
Oct 07 21:38:16 compute-0 systemd[1]: Starting User Manager for UID 0...
Oct 07 21:38:16 compute-0 systemd[166266]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Oct 07 21:38:16 compute-0 podman[166253]: 2025-10-07 21:38:16.884489151 +0000 UTC m=+0.101384752 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=starting, health_failing_streak=1, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, tcib_managed=true, config_id=iscsid, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct 07 21:38:16 compute-0 systemd[1]: bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71-166a9894120b0a34.service: Main process exited, code=exited, status=1/FAILURE
Oct 07 21:38:16 compute-0 systemd[1]: bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71-166a9894120b0a34.service: Failed with result 'exit-code'.
Oct 07 21:38:16 compute-0 systemd[166266]: Queued start job for default target Main User Target.
Oct 07 21:38:17 compute-0 systemd[166266]: Created slice User Application Slice.
Oct 07 21:38:17 compute-0 systemd[166266]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Oct 07 21:38:17 compute-0 systemd[166266]: Started Daily Cleanup of User's Temporary Directories.
Oct 07 21:38:17 compute-0 systemd[166266]: Reached target Paths.
Oct 07 21:38:17 compute-0 systemd[166266]: Reached target Timers.
Oct 07 21:38:17 compute-0 systemd[166266]: Starting D-Bus User Message Bus Socket...
Oct 07 21:38:17 compute-0 systemd[166266]: Starting Create User's Volatile Files and Directories...
Oct 07 21:38:17 compute-0 systemd[166266]: Finished Create User's Volatile Files and Directories.
Oct 07 21:38:17 compute-0 systemd[166266]: Listening on D-Bus User Message Bus Socket.
Oct 07 21:38:17 compute-0 systemd[166266]: Reached target Sockets.
Oct 07 21:38:17 compute-0 systemd[166266]: Reached target Basic System.
Oct 07 21:38:17 compute-0 systemd[166266]: Reached target Main User Target.
Oct 07 21:38:17 compute-0 systemd[166266]: Startup finished in 143ms.
Oct 07 21:38:17 compute-0 systemd[1]: Started User Manager for UID 0.
Oct 07 21:38:17 compute-0 systemd[1]: Started Session c3 of User root.
Oct 07 21:38:17 compute-0 sudo[166252]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 07 21:38:17 compute-0 iscsid[166246]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 07 21:38:17 compute-0 iscsid[166246]: INFO:__main__:Validating config file
Oct 07 21:38:17 compute-0 iscsid[166246]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 07 21:38:17 compute-0 iscsid[166246]: INFO:__main__:Writing out command to execute
Oct 07 21:38:17 compute-0 sudo[166252]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:17 compute-0 systemd[1]: session-c3.scope: Deactivated successfully.
Oct 07 21:38:17 compute-0 iscsid[166246]: ++ cat /run_command
Oct 07 21:38:17 compute-0 iscsid[166246]: + CMD='/usr/sbin/iscsid -f'
Oct 07 21:38:17 compute-0 iscsid[166246]: + ARGS=
Oct 07 21:38:17 compute-0 iscsid[166246]: + sudo kolla_copy_cacerts
Oct 07 21:38:17 compute-0 sudo[166314]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Oct 07 21:38:17 compute-0 systemd[1]: Started Session c4 of User root.
Oct 07 21:38:17 compute-0 sudo[166314]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 07 21:38:17 compute-0 sudo[166314]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:17 compute-0 systemd[1]: session-c4.scope: Deactivated successfully.
Oct 07 21:38:17 compute-0 iscsid[166246]: + [[ ! -n '' ]]
Oct 07 21:38:17 compute-0 iscsid[166246]: + . kolla_extend_start
Oct 07 21:38:17 compute-0 iscsid[166246]: ++ [[ ! -f /etc/iscsi/initiatorname.iscsi ]]
Oct 07 21:38:17 compute-0 iscsid[166246]: + echo 'Running command: '\''/usr/sbin/iscsid -f'\'''
Oct 07 21:38:17 compute-0 iscsid[166246]: Running command: '/usr/sbin/iscsid -f'
Oct 07 21:38:17 compute-0 iscsid[166246]: + umask 0022
Oct 07 21:38:17 compute-0 iscsid[166246]: + exec /usr/sbin/iscsid -f
Oct 07 21:38:17 compute-0 kernel: Loading iSCSI transport class v2.0-870.
Oct 07 21:38:17 compute-0 python3.9[166450]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.iscsid_restart_required follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 07 21:38:18 compute-0 sudo[166600]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmdazqjkowpbyjpgskeunlkffhxtlrdv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873098.304346-826-170508159588660/AnsiballZ_file.py'
Oct 07 21:38:18 compute-0 sudo[166600]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:18 compute-0 python3.9[166602]: ansible-ansible.builtin.file Invoked with path=/etc/iscsi/.iscsid_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:38:18 compute-0 sudo[166600]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:19 compute-0 sudo[166752]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oiswjesfvtgjbktuymyzvinpsgolcmjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873099.3184893-848-200869120117016/AnsiballZ_service_facts.py'
Oct 07 21:38:19 compute-0 sudo[166752]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:19 compute-0 python3.9[166754]: ansible-ansible.builtin.service_facts Invoked
Oct 07 21:38:19 compute-0 network[166771]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 07 21:38:19 compute-0 network[166772]: 'network-scripts' will be removed from distribution in near future.
Oct 07 21:38:19 compute-0 network[166773]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 07 21:38:24 compute-0 sudo[166752]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:38:25.574 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:38:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:38:25.576 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:38:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:38:25.576 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:38:25 compute-0 podman[166921]: 2025-10-07 21:38:25.915207248 +0000 UTC m=+0.139623612 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 07 21:38:26 compute-0 sudo[167072]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wrzeqmfpysqtshoyqzeovkpdilbgsxkt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873106.407986-868-265419375154977/AnsiballZ_file.py'
Oct 07 21:38:26 compute-0 sudo[167072]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:26 compute-0 python3.9[167074]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct 07 21:38:26 compute-0 sudo[167072]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:27 compute-0 systemd[1]: Stopping User Manager for UID 0...
Oct 07 21:38:27 compute-0 systemd[166266]: Activating special unit Exit the Session...
Oct 07 21:38:27 compute-0 systemd[166266]: Stopped target Main User Target.
Oct 07 21:38:27 compute-0 systemd[166266]: Stopped target Basic System.
Oct 07 21:38:27 compute-0 systemd[166266]: Stopped target Paths.
Oct 07 21:38:27 compute-0 systemd[166266]: Stopped target Sockets.
Oct 07 21:38:27 compute-0 systemd[166266]: Stopped target Timers.
Oct 07 21:38:27 compute-0 systemd[166266]: Stopped Daily Cleanup of User's Temporary Directories.
Oct 07 21:38:27 compute-0 systemd[166266]: Closed D-Bus User Message Bus Socket.
Oct 07 21:38:27 compute-0 systemd[166266]: Stopped Create User's Volatile Files and Directories.
Oct 07 21:38:27 compute-0 systemd[166266]: Removed slice User Application Slice.
Oct 07 21:38:27 compute-0 systemd[166266]: Reached target Shutdown.
Oct 07 21:38:27 compute-0 systemd[166266]: Finished Exit the Session.
Oct 07 21:38:27 compute-0 systemd[166266]: Reached target Exit the Session.
Oct 07 21:38:27 compute-0 systemd[1]: user@0.service: Deactivated successfully.
Oct 07 21:38:27 compute-0 systemd[1]: Stopped User Manager for UID 0.
Oct 07 21:38:27 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/0...
Oct 07 21:38:27 compute-0 systemd[1]: run-user-0.mount: Deactivated successfully.
Oct 07 21:38:27 compute-0 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Oct 07 21:38:27 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/0.
Oct 07 21:38:27 compute-0 systemd[1]: Removed slice User Slice of UID 0.
Oct 07 21:38:27 compute-0 sudo[167227]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aeanfnfbghbgceevjghlsjlzlvyzyouu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873107.215585-884-263684167040208/AnsiballZ_modprobe.py'
Oct 07 21:38:27 compute-0 sudo[167227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:27 compute-0 python3.9[167229]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Oct 07 21:38:27 compute-0 sudo[167227]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:28 compute-0 sudo[167383]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcxrduolcouhtiqxkcloneuclezooako ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873108.1970072-900-136270741740474/AnsiballZ_stat.py'
Oct 07 21:38:28 compute-0 sudo[167383]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:28 compute-0 python3.9[167385]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:38:28 compute-0 sudo[167383]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:29 compute-0 sudo[167506]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iysckizvkpaclzbxuxyapwxgmbqcpckb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873108.1970072-900-136270741740474/AnsiballZ_copy.py'
Oct 07 21:38:29 compute-0 sudo[167506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:29 compute-0 python3.9[167508]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759873108.1970072-900-136270741740474/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:38:29 compute-0 sudo[167506]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:30 compute-0 unix_chkpwd[167636]: password check failed for user (root)
Oct 07 21:38:30 compute-0 sshd-session[167084]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=116.110.151.5  user=root
Oct 07 21:38:30 compute-0 sudo[167659]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugjxtqxoknvjcvlqrdbmxzhmzojwmffj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873109.646713-932-174841636408922/AnsiballZ_lineinfile.py'
Oct 07 21:38:30 compute-0 sudo[167659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:30 compute-0 python3.9[167661]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:38:30 compute-0 sudo[167659]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:30 compute-0 sudo[167811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgnuvvqgngdjsorfgitupfhhsozpnxdv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873110.5313902-948-230381190116487/AnsiballZ_systemd.py'
Oct 07 21:38:30 compute-0 sudo[167811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:31 compute-0 podman[167813]: 2025-10-07 21:38:30.998082539 +0000 UTC m=+0.066753880 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.4)
Oct 07 21:38:31 compute-0 python3.9[167814]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 07 21:38:31 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Oct 07 21:38:31 compute-0 systemd[1]: Stopped Load Kernel Modules.
Oct 07 21:38:31 compute-0 systemd[1]: Stopping Load Kernel Modules...
Oct 07 21:38:31 compute-0 systemd[1]: Starting Load Kernel Modules...
Oct 07 21:38:31 compute-0 systemd[1]: Finished Load Kernel Modules.
Oct 07 21:38:31 compute-0 sudo[167811]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:31 compute-0 sudo[167986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjpinkonfmpwfrmvhtaltcyjfhvhxgej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873111.6141722-964-245225794199639/AnsiballZ_file.py'
Oct 07 21:38:31 compute-0 sudo[167986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:32 compute-0 python3.9[167988]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:38:32 compute-0 sshd-session[167084]: Failed password for root from 116.110.151.5 port 51706 ssh2
Oct 07 21:38:32 compute-0 sudo[167986]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:32 compute-0 sudo[168138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdfohydajoxemycqgiurdxwejpniuogh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873112.5584476-982-66682365558200/AnsiballZ_stat.py'
Oct 07 21:38:32 compute-0 sudo[168138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:33 compute-0 python3.9[168140]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 07 21:38:33 compute-0 sudo[168138]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:33 compute-0 sudo[168290]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thedhimqgdpokupdomwcijokmfyuygmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873113.4154613-1000-171882142732291/AnsiballZ_stat.py'
Oct 07 21:38:33 compute-0 sudo[168290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:33 compute-0 python3.9[168292]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 07 21:38:33 compute-0 sudo[168290]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:34 compute-0 sshd-session[167084]: Connection closed by authenticating user root 116.110.151.5 port 51706 [preauth]
Oct 07 21:38:34 compute-0 sudo[168442]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odzftcyqfbqnwbpzjhtculpsylhtjgvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873114.1548612-1016-35206936484814/AnsiballZ_stat.py'
Oct 07 21:38:34 compute-0 sudo[168442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:34 compute-0 python3.9[168444]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:38:34 compute-0 sudo[168442]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:35 compute-0 sudo[168565]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zodjmltocfiuwrpxvpddblmylldrvrpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873114.1548612-1016-35206936484814/AnsiballZ_copy.py'
Oct 07 21:38:35 compute-0 sudo[168565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:35 compute-0 python3.9[168567]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759873114.1548612-1016-35206936484814/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:38:35 compute-0 sudo[168565]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:36 compute-0 sudo[168717]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqmlumuocmhdaznqgbcbsfwsgsenotey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873115.684459-1046-169964355091482/AnsiballZ_command.py'
Oct 07 21:38:36 compute-0 sudo[168717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:36 compute-0 python3.9[168719]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:38:36 compute-0 sudo[168717]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:37 compute-0 sudo[168870]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucvyzpkshshygtvavxzblmgbfmqogutq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873116.7047565-1062-257945708876682/AnsiballZ_lineinfile.py'
Oct 07 21:38:37 compute-0 sudo[168870]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:37 compute-0 python3.9[168872]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:38:37 compute-0 sudo[168870]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:37 compute-0 sudo[169022]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqecwnhabjdybmgewdepvfjnhpzopwuw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873117.4720144-1078-151271245828324/AnsiballZ_replace.py'
Oct 07 21:38:37 compute-0 sudo[169022]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:38 compute-0 python3.9[169024]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:38:38 compute-0 sudo[169022]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:38 compute-0 sudo[169174]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvgeyogcuhdlxggedjenjszuykifsfem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873118.314631-1094-254411414751365/AnsiballZ_replace.py'
Oct 07 21:38:38 compute-0 sudo[169174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:38 compute-0 python3.9[169176]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:38:38 compute-0 sudo[169174]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:39 compute-0 sudo[169326]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwsgnchchelljqwzsndviduwrvblytft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873119.1431308-1112-214076161697515/AnsiballZ_lineinfile.py'
Oct 07 21:38:39 compute-0 sudo[169326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:39 compute-0 python3.9[169328]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:38:39 compute-0 sudo[169326]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:40 compute-0 sudo[169478]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxktxjihcjytszywqrwwcyvqkqzkmcdj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873119.7959938-1112-202539843529032/AnsiballZ_lineinfile.py'
Oct 07 21:38:40 compute-0 sudo[169478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:40 compute-0 python3.9[169480]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:38:40 compute-0 sudo[169478]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:40 compute-0 sudo[169630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xisvevdtalndfxdvqbpjqtvsmlsrwbcx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873120.505051-1112-140778853268515/AnsiballZ_lineinfile.py'
Oct 07 21:38:40 compute-0 sudo[169630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:41 compute-0 python3.9[169632]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:38:41 compute-0 sudo[169630]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:41 compute-0 sudo[169782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkydfpxrmjqebzjfcydoudaaruqcilzu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873121.2604697-1112-118743682561928/AnsiballZ_lineinfile.py'
Oct 07 21:38:41 compute-0 sudo[169782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:41 compute-0 systemd[1]: virtnodedevd.service: Deactivated successfully.
Oct 07 21:38:41 compute-0 python3.9[169784]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:38:41 compute-0 sudo[169782]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:42 compute-0 sudo[169935]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tynbzczuvnwufiyovvkozydwajnaljbt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873122.2235608-1170-190830684984843/AnsiballZ_stat.py'
Oct 07 21:38:42 compute-0 sudo[169935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:42 compute-0 python3.9[169937]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 07 21:38:42 compute-0 sudo[169935]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:42 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Oct 07 21:38:43 compute-0 sudo[170090]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fuyuqdbrbcpmpzfzppgjzwpasxfccuec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873123.0055542-1186-84885283457018/AnsiballZ_file.py'
Oct 07 21:38:43 compute-0 sudo[170090]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:43 compute-0 python3.9[170092]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:38:43 compute-0 sudo[170090]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:44 compute-0 sudo[170242]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwflvxsjmwdnnpmjbxrpzmmqukosrlyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873123.9092224-1204-12189861037104/AnsiballZ_file.py'
Oct 07 21:38:44 compute-0 sudo[170242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:44 compute-0 python3.9[170244]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:38:44 compute-0 sudo[170242]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:44 compute-0 systemd[1]: virtqemud.service: Deactivated successfully.
Oct 07 21:38:45 compute-0 sudo[170395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkhrvrflgblxiswikedqyhqbfovwkmwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873124.7849026-1220-23046587683407/AnsiballZ_stat.py'
Oct 07 21:38:45 compute-0 sudo[170395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:45 compute-0 python3.9[170397]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:38:45 compute-0 sudo[170395]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:45 compute-0 sudo[170473]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcnxxugzpfytuzvibyagrsjxusfpflml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873124.7849026-1220-23046587683407/AnsiballZ_file.py'
Oct 07 21:38:45 compute-0 sudo[170473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:45 compute-0 systemd[1]: virtsecretd.service: Deactivated successfully.
Oct 07 21:38:45 compute-0 python3.9[170475]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:38:45 compute-0 sudo[170473]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:46 compute-0 sudo[170626]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezkkzavqsazseikywhzpkqhytieozsoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873126.0429358-1220-238460928582726/AnsiballZ_stat.py'
Oct 07 21:38:46 compute-0 sudo[170626]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:46 compute-0 python3.9[170628]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:38:46 compute-0 sudo[170626]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:46 compute-0 sudo[170704]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-scxtmcvwlqqoxgjjquxldssjoyyfdmry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873126.0429358-1220-238460928582726/AnsiballZ_file.py'
Oct 07 21:38:46 compute-0 sudo[170704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:47 compute-0 podman[170706]: 2025-10-07 21:38:47.005821075 +0000 UTC m=+0.072576863 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251007, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Oct 07 21:38:47 compute-0 python3.9[170707]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:38:47 compute-0 sudo[170704]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:47 compute-0 sudo[170878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qsnckosmyikxokrvuqrafprjmyybpbiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873127.3900845-1266-236100929302331/AnsiballZ_file.py'
Oct 07 21:38:47 compute-0 sudo[170878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:47 compute-0 python3.9[170880]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:38:47 compute-0 sudo[170878]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:48 compute-0 sudo[171030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdlpwqjqrckejiucdumahvkffzzxsabs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873128.1612394-1282-8150231731742/AnsiballZ_stat.py'
Oct 07 21:38:48 compute-0 sudo[171030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:48 compute-0 python3.9[171032]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:38:48 compute-0 sudo[171030]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:49 compute-0 sudo[171108]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnvumppltnvzuauxiiqziujqhfczxvcu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873128.1612394-1282-8150231731742/AnsiballZ_file.py'
Oct 07 21:38:49 compute-0 sudo[171108]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:49 compute-0 python3.9[171110]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:38:49 compute-0 sudo[171108]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:49 compute-0 sudo[171260]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nraozqmmttqeotlphzpjjwwilltmqaod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873129.4749167-1306-121976042904000/AnsiballZ_stat.py'
Oct 07 21:38:49 compute-0 sudo[171260]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:50 compute-0 python3.9[171262]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:38:50 compute-0 sudo[171260]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:50 compute-0 sudo[171338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnczzelxgyvcqzvlkmrdrgmtdbxvltoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873129.4749167-1306-121976042904000/AnsiballZ_file.py'
Oct 07 21:38:50 compute-0 sudo[171338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:50 compute-0 python3.9[171340]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:38:50 compute-0 sudo[171338]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:51 compute-0 sudo[171490]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kysdfvwyktfngtotpetzghvmliqwnknj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873130.9507906-1330-203941770127522/AnsiballZ_systemd.py'
Oct 07 21:38:51 compute-0 sudo[171490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:51 compute-0 python3.9[171492]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 07 21:38:51 compute-0 systemd[1]: Reloading.
Oct 07 21:38:51 compute-0 systemd-rc-local-generator[171520]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:38:51 compute-0 systemd-sysv-generator[171523]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:38:52 compute-0 sudo[171490]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:52 compute-0 sudo[171679]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppzwkczliqxaognxrtajqpfolyohsang ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873132.339889-1346-114966829791970/AnsiballZ_stat.py'
Oct 07 21:38:52 compute-0 sudo[171679]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:52 compute-0 python3.9[171681]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:38:52 compute-0 sudo[171679]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:53 compute-0 sudo[171757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjlojnlcjjkdgywrlrcknmwgqllboljn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873132.339889-1346-114966829791970/AnsiballZ_file.py'
Oct 07 21:38:53 compute-0 sudo[171757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:53 compute-0 python3.9[171759]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:38:53 compute-0 sudo[171757]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:54 compute-0 sudo[171909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-icdzavmzwxnygnpgxxxnpymosvjpsevz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873133.6529233-1370-93569739147915/AnsiballZ_stat.py'
Oct 07 21:38:54 compute-0 sudo[171909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:54 compute-0 python3.9[171911]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:38:54 compute-0 sudo[171909]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:54 compute-0 sudo[171987]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnrypeahnpzihhcfmxdxiwisrufcytpm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873133.6529233-1370-93569739147915/AnsiballZ_file.py'
Oct 07 21:38:54 compute-0 sudo[171987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:54 compute-0 python3.9[171989]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:38:54 compute-0 sudo[171987]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:55 compute-0 sudo[172139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkdfpldfvbwdoctzvyxzztodtyvymqle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873135.015259-1394-167166684636904/AnsiballZ_systemd.py'
Oct 07 21:38:55 compute-0 sudo[172139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:55 compute-0 python3.9[172141]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 07 21:38:55 compute-0 systemd[1]: Reloading.
Oct 07 21:38:55 compute-0 systemd-rc-local-generator[172165]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:38:55 compute-0 systemd-sysv-generator[172168]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:38:56 compute-0 systemd[1]: Starting Create netns directory...
Oct 07 21:38:56 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 07 21:38:56 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 07 21:38:56 compute-0 systemd[1]: Finished Create netns directory.
Oct 07 21:38:56 compute-0 sudo[172139]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:56 compute-0 podman[172178]: 2025-10-07 21:38:56.102378086 +0000 UTC m=+0.131433797 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Oct 07 21:38:56 compute-0 sudo[172358]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrdablrykzvtdegjfvqeujpovjrtvyvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873136.42497-1414-174656780451348/AnsiballZ_file.py'
Oct 07 21:38:56 compute-0 sudo[172358]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:56 compute-0 python3.9[172360]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:38:56 compute-0 sudo[172358]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:57 compute-0 sudo[172512]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfznmujzodpxhioovaaqocothmiptmrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873137.2775571-1430-55939354251660/AnsiballZ_stat.py'
Oct 07 21:38:57 compute-0 sudo[172512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:57 compute-0 python3.9[172514]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:38:57 compute-0 sudo[172512]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:58 compute-0 sudo[172635]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfexmveasshtiwzisgslplutrktcyzfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873137.2775571-1430-55939354251660/AnsiballZ_copy.py'
Oct 07 21:38:58 compute-0 sudo[172635]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:58 compute-0 python3.9[172637]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759873137.2775571-1430-55939354251660/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:38:58 compute-0 sudo[172635]: pam_unix(sudo:session): session closed for user root
Oct 07 21:38:59 compute-0 sudo[172787]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zneanmihdeiyiszehshkeuivbltdkjfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873139.2373018-1464-165547073086394/AnsiballZ_file.py'
Oct 07 21:38:59 compute-0 sudo[172787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:38:59 compute-0 python3.9[172789]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:38:59 compute-0 sudo[172787]: pam_unix(sudo:session): session closed for user root
Oct 07 21:39:00 compute-0 sudo[172939]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwosrgoawsaajusffsnbptyhhcqwafrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873140.1225808-1480-169385616663267/AnsiballZ_stat.py'
Oct 07 21:39:00 compute-0 sudo[172939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:39:00 compute-0 python3.9[172941]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:39:00 compute-0 sudo[172939]: pam_unix(sudo:session): session closed for user root
Oct 07 21:39:01 compute-0 sudo[173072]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lsjlivatwdsdmvubuaapupasrlwxcedt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873140.1225808-1480-169385616663267/AnsiballZ_copy.py'
Oct 07 21:39:01 compute-0 sudo[173072]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:39:01 compute-0 podman[173036]: 2025-10-07 21:39:01.16963476 +0000 UTC m=+0.078610773 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251007, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct 07 21:39:01 compute-0 python3.9[173080]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759873140.1225808-1480-169385616663267/.source.json _original_basename=.a_jasvch follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:39:01 compute-0 sudo[173072]: pam_unix(sudo:session): session closed for user root
Oct 07 21:39:02 compute-0 sudo[173230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yosejlcetukimopyctzsatqfcqsmkjqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873141.6207383-1510-212307545056689/AnsiballZ_file.py'
Oct 07 21:39:02 compute-0 sudo[173230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:39:02 compute-0 python3.9[173232]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:39:02 compute-0 sudo[173230]: pam_unix(sudo:session): session closed for user root
Oct 07 21:39:02 compute-0 sudo[173382]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ckjhlxzywtwbwyozevfdhwnqjdcannxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873142.4883387-1526-9517483925768/AnsiballZ_stat.py'
Oct 07 21:39:02 compute-0 sudo[173382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:39:02 compute-0 sudo[173382]: pam_unix(sudo:session): session closed for user root
Oct 07 21:39:03 compute-0 sudo[173505]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsszqbbxqfofskybsxtlzekqicpejypt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873142.4883387-1526-9517483925768/AnsiballZ_copy.py'
Oct 07 21:39:03 compute-0 sudo[173505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:39:03 compute-0 sudo[173505]: pam_unix(sudo:session): session closed for user root
Oct 07 21:39:04 compute-0 sudo[173657]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlwuxzwxredfftgzxhlfqlzdeoacrkkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873144.163785-1560-186680059808592/AnsiballZ_container_config_data.py'
Oct 07 21:39:04 compute-0 sudo[173657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:39:04 compute-0 python3.9[173659]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Oct 07 21:39:04 compute-0 sudo[173657]: pam_unix(sudo:session): session closed for user root
Oct 07 21:39:05 compute-0 sudo[173809]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jikciteyzbzcsunljnkfguujhyxilcua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873145.0981662-1578-59808572173892/AnsiballZ_container_config_hash.py'
Oct 07 21:39:05 compute-0 sudo[173809]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:39:05 compute-0 sshd-session[172385]: Invalid user 1234 from 116.110.151.5 port 59944
Oct 07 21:39:05 compute-0 python3.9[173811]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 07 21:39:05 compute-0 sudo[173809]: pam_unix(sudo:session): session closed for user root
Oct 07 21:39:05 compute-0 sshd-session[172385]: pam_unix(sshd:auth): check pass; user unknown
Oct 07 21:39:05 compute-0 sshd-session[172385]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=116.110.151.5
Oct 07 21:39:06 compute-0 sudo[173961]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlrdnjyjxajcesljotymvknykawfsiri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873146.0085053-1596-139442642431281/AnsiballZ_podman_container_info.py'
Oct 07 21:39:06 compute-0 sudo[173961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:39:06 compute-0 python3.9[173963]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct 07 21:39:06 compute-0 sudo[173961]: pam_unix(sudo:session): session closed for user root
Oct 07 21:39:07 compute-0 sudo[174139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xlybeeshbzjcqiuzdnjnjrvfrlbumwte ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759873147.4879282-1622-104111277027163/AnsiballZ_edpm_container_manage.py'
Oct 07 21:39:07 compute-0 sudo[174139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:39:07 compute-0 sshd-session[172385]: Failed password for invalid user 1234 from 116.110.151.5 port 59944 ssh2
Oct 07 21:39:08 compute-0 python3[174141]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct 07 21:39:08 compute-0 podman[174179]: 2025-10-07 21:39:08.349988486 +0000 UTC m=+0.077895072 container create c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251007, container_name=multipathd, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Oct 07 21:39:08 compute-0 podman[174179]: 2025-10-07 21:39:08.303426129 +0000 UTC m=+0.031332705 image pull 4681127ca41b9c0ad73cf128c4c3175cc608608dca0d6e6910829324a5619ecd 38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest
Oct 07 21:39:08 compute-0 python3[174141]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z 38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest
Oct 07 21:39:08 compute-0 sudo[174139]: pam_unix(sudo:session): session closed for user root
Oct 07 21:39:09 compute-0 sudo[174367]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojaczlrwbqqvuggrztqtwopwotqqscbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873148.7966015-1638-135006392062036/AnsiballZ_stat.py'
Oct 07 21:39:09 compute-0 sudo[174367]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:39:09 compute-0 python3.9[174369]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 07 21:39:09 compute-0 sudo[174367]: pam_unix(sudo:session): session closed for user root
Oct 07 21:39:10 compute-0 sudo[174521]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbjlbxdkufgggvnfwvuebpmaknuagoon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873149.7178311-1656-133711392500664/AnsiballZ_file.py'
Oct 07 21:39:10 compute-0 sudo[174521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:39:10 compute-0 python3.9[174523]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:39:10 compute-0 sudo[174521]: pam_unix(sudo:session): session closed for user root
Oct 07 21:39:10 compute-0 sudo[174597]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wofrhziyypfkmmztuzrcxasemsvgigsy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873149.7178311-1656-133711392500664/AnsiballZ_stat.py'
Oct 07 21:39:10 compute-0 sudo[174597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:39:10 compute-0 python3.9[174599]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 07 21:39:10 compute-0 sudo[174597]: pam_unix(sudo:session): session closed for user root
Oct 07 21:39:11 compute-0 sudo[174748]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tiadfqrjrhsqwknscinpuyivjmpmjsdz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873150.8741672-1656-244190655628003/AnsiballZ_copy.py'
Oct 07 21:39:11 compute-0 sudo[174748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:39:11 compute-0 python3.9[174750]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759873150.8741672-1656-244190655628003/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:39:11 compute-0 sudo[174748]: pam_unix(sudo:session): session closed for user root
Oct 07 21:39:11 compute-0 sudo[174824]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfwhmtvkzwgorwhhpxthenrshvmagvni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873150.8741672-1656-244190655628003/AnsiballZ_systemd.py'
Oct 07 21:39:11 compute-0 sudo[174824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:39:12 compute-0 python3.9[174826]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 07 21:39:12 compute-0 systemd[1]: Reloading.
Oct 07 21:39:12 compute-0 systemd-rc-local-generator[174853]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:39:12 compute-0 systemd-sysv-generator[174856]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:39:12 compute-0 sudo[174824]: pam_unix(sudo:session): session closed for user root
Oct 07 21:39:12 compute-0 sudo[174934]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftbtshpfqisngdwzwbwdzrtdqisljnmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873150.8741672-1656-244190655628003/AnsiballZ_systemd.py'
Oct 07 21:39:12 compute-0 sudo[174934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:39:13 compute-0 python3.9[174936]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 07 21:39:13 compute-0 systemd[1]: Reloading.
Oct 07 21:39:13 compute-0 systemd-rc-local-generator[174960]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:39:13 compute-0 systemd-sysv-generator[174965]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:39:13 compute-0 systemd[1]: Starting multipathd container...
Oct 07 21:39:13 compute-0 systemd[1]: Started libcrun container.
Oct 07 21:39:13 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c21db7cf8aca75d33215d36f3105b34378e902b4905556688822f232bd9cb086/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct 07 21:39:13 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c21db7cf8aca75d33215d36f3105b34378e902b4905556688822f232bd9cb086/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct 07 21:39:13 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2.
Oct 07 21:39:13 compute-0 podman[174976]: 2025-10-07 21:39:13.759111867 +0000 UTC m=+0.145119110 container init c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 07 21:39:13 compute-0 multipathd[174990]: + sudo -E kolla_set_configs
Oct 07 21:39:13 compute-0 podman[174976]: 2025-10-07 21:39:13.79662674 +0000 UTC m=+0.182633883 container start c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct 07 21:39:13 compute-0 podman[174976]: multipathd
Oct 07 21:39:13 compute-0 sudo[174996]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Oct 07 21:39:13 compute-0 sudo[174996]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 07 21:39:13 compute-0 systemd[1]: Started multipathd container.
Oct 07 21:39:13 compute-0 sudo[174934]: pam_unix(sudo:session): session closed for user root
Oct 07 21:39:13 compute-0 multipathd[174990]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 07 21:39:13 compute-0 multipathd[174990]: INFO:__main__:Validating config file
Oct 07 21:39:13 compute-0 multipathd[174990]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 07 21:39:13 compute-0 multipathd[174990]: INFO:__main__:Writing out command to execute
Oct 07 21:39:13 compute-0 sudo[174996]: pam_unix(sudo:session): session closed for user root
Oct 07 21:39:13 compute-0 multipathd[174990]: ++ cat /run_command
Oct 07 21:39:13 compute-0 multipathd[174990]: + CMD='/usr/sbin/multipathd -d'
Oct 07 21:39:13 compute-0 multipathd[174990]: + ARGS=
Oct 07 21:39:13 compute-0 multipathd[174990]: + sudo kolla_copy_cacerts
Oct 07 21:39:13 compute-0 sudo[175022]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Oct 07 21:39:13 compute-0 sudo[175022]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 07 21:39:13 compute-0 sudo[175022]: pam_unix(sudo:session): session closed for user root
Oct 07 21:39:13 compute-0 multipathd[174990]: + [[ ! -n '' ]]
Oct 07 21:39:13 compute-0 multipathd[174990]: + . kolla_extend_start
Oct 07 21:39:13 compute-0 multipathd[174990]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Oct 07 21:39:13 compute-0 multipathd[174990]: Running command: '/usr/sbin/multipathd -d'
Oct 07 21:39:13 compute-0 multipathd[174990]: + umask 0022
Oct 07 21:39:13 compute-0 multipathd[174990]: + exec /usr/sbin/multipathd -d
Oct 07 21:39:13 compute-0 podman[174997]: 2025-10-07 21:39:13.924659752 +0000 UTC m=+0.110906324 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.vendor=CentOS)
Oct 07 21:39:13 compute-0 multipathd[174990]: 2887.566176 | multipathd v0.9.9: start up
Oct 07 21:39:13 compute-0 multipathd[174990]: 2887.573246 | reconfigure: setting up paths and maps
Oct 07 21:39:13 compute-0 systemd[1]: c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2-17b261a11bb191e5.service: Main process exited, code=exited, status=1/FAILURE
Oct 07 21:39:13 compute-0 multipathd[174990]: 2887.574962 | _check_bindings_file: failed to read header from /etc/multipath/bindings
Oct 07 21:39:13 compute-0 systemd[1]: c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2-17b261a11bb191e5.service: Failed with result 'exit-code'.
Oct 07 21:39:13 compute-0 multipathd[174990]: 2887.576162 | updated bindings file /etc/multipath/bindings
Oct 07 21:39:14 compute-0 python3.9[175180]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 07 21:39:15 compute-0 sudo[175332]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rajnjoscsqhjylbawlzlchtbykzpfdrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873155.0790617-1728-133877054192413/AnsiballZ_command.py'
Oct 07 21:39:15 compute-0 sudo[175332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:39:15 compute-0 python3.9[175334]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:39:15 compute-0 sudo[175332]: pam_unix(sudo:session): session closed for user root
Oct 07 21:39:16 compute-0 sudo[175497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvglqbucrqzfuvcpcrxiqvbburjcwhzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873156.028444-1744-184661005971981/AnsiballZ_systemd.py'
Oct 07 21:39:16 compute-0 sudo[175497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:39:16 compute-0 python3.9[175499]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 07 21:39:16 compute-0 systemd[1]: Stopping multipathd container...
Oct 07 21:39:16 compute-0 multipathd[174990]: 2890.488189 | multipathd: shut down
Oct 07 21:39:16 compute-0 systemd[1]: libpod-c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2.scope: Deactivated successfully.
Oct 07 21:39:16 compute-0 podman[175503]: 2025-10-07 21:39:16.890987066 +0000 UTC m=+0.096532864 container died c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Oct 07 21:39:16 compute-0 systemd[1]: c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2-17b261a11bb191e5.timer: Deactivated successfully.
Oct 07 21:39:16 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2.
Oct 07 21:39:16 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2-userdata-shm.mount: Deactivated successfully.
Oct 07 21:39:16 compute-0 systemd[1]: var-lib-containers-storage-overlay-c21db7cf8aca75d33215d36f3105b34378e902b4905556688822f232bd9cb086-merged.mount: Deactivated successfully.
Oct 07 21:39:16 compute-0 podman[175503]: 2025-10-07 21:39:16.942040314 +0000 UTC m=+0.147586042 container cleanup c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Oct 07 21:39:16 compute-0 podman[175503]: multipathd
Oct 07 21:39:17 compute-0 podman[175529]: multipathd
Oct 07 21:39:17 compute-0 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Oct 07 21:39:17 compute-0 systemd[1]: Stopped multipathd container.
Oct 07 21:39:17 compute-0 systemd[1]: Starting multipathd container...
Oct 07 21:39:17 compute-0 systemd[1]: Started libcrun container.
Oct 07 21:39:17 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c21db7cf8aca75d33215d36f3105b34378e902b4905556688822f232bd9cb086/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct 07 21:39:17 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c21db7cf8aca75d33215d36f3105b34378e902b4905556688822f232bd9cb086/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct 07 21:39:17 compute-0 podman[175540]: 2025-10-07 21:39:17.149686056 +0000 UTC m=+0.081509046 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, container_name=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Oct 07 21:39:17 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2.
Oct 07 21:39:17 compute-0 podman[175542]: 2025-10-07 21:39:17.193088281 +0000 UTC m=+0.130243177 container init c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, io.buildah.version=1.41.4, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Oct 07 21:39:17 compute-0 sshd-session[172385]: Connection closed by invalid user 1234 116.110.151.5 port 59944 [preauth]
Oct 07 21:39:17 compute-0 multipathd[175567]: + sudo -E kolla_set_configs
Oct 07 21:39:17 compute-0 sudo[175582]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Oct 07 21:39:17 compute-0 sudo[175582]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 07 21:39:17 compute-0 podman[175542]: 2025-10-07 21:39:17.22119961 +0000 UTC m=+0.158354486 container start c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest)
Oct 07 21:39:17 compute-0 podman[175542]: multipathd
Oct 07 21:39:17 compute-0 systemd[1]: Started multipathd container.
Oct 07 21:39:17 compute-0 sudo[175497]: pam_unix(sudo:session): session closed for user root
Oct 07 21:39:17 compute-0 multipathd[175567]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 07 21:39:17 compute-0 multipathd[175567]: INFO:__main__:Validating config file
Oct 07 21:39:17 compute-0 multipathd[175567]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 07 21:39:17 compute-0 multipathd[175567]: INFO:__main__:Writing out command to execute
Oct 07 21:39:17 compute-0 sudo[175582]: pam_unix(sudo:session): session closed for user root
Oct 07 21:39:17 compute-0 multipathd[175567]: ++ cat /run_command
Oct 07 21:39:17 compute-0 multipathd[175567]: + CMD='/usr/sbin/multipathd -d'
Oct 07 21:39:17 compute-0 multipathd[175567]: + ARGS=
Oct 07 21:39:17 compute-0 multipathd[175567]: + sudo kolla_copy_cacerts
Oct 07 21:39:17 compute-0 podman[175583]: 2025-10-07 21:39:17.311745268 +0000 UTC m=+0.074361467 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251007)
Oct 07 21:39:17 compute-0 sudo[175608]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Oct 07 21:39:17 compute-0 systemd[1]: c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2-5c7182ca3232e63a.service: Main process exited, code=exited, status=1/FAILURE
Oct 07 21:39:17 compute-0 systemd[1]: c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2-5c7182ca3232e63a.service: Failed with result 'exit-code'.
Oct 07 21:39:17 compute-0 sudo[175608]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 07 21:39:17 compute-0 sudo[175608]: pam_unix(sudo:session): session closed for user root
Oct 07 21:39:17 compute-0 multipathd[175567]: + [[ ! -n '' ]]
Oct 07 21:39:17 compute-0 multipathd[175567]: + . kolla_extend_start
Oct 07 21:39:17 compute-0 multipathd[175567]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Oct 07 21:39:17 compute-0 multipathd[175567]: Running command: '/usr/sbin/multipathd -d'
Oct 07 21:39:17 compute-0 multipathd[175567]: + umask 0022
Oct 07 21:39:17 compute-0 multipathd[175567]: + exec /usr/sbin/multipathd -d
Oct 07 21:39:17 compute-0 multipathd[175567]: 2890.983837 | multipathd v0.9.9: start up
Oct 07 21:39:17 compute-0 multipathd[175567]: 2890.991687 | reconfigure: setting up paths and maps
Oct 07 21:39:17 compute-0 sudo[175764]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfpwqqxcrsoinghwdsuqrlcckjndfhxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873157.42921-1760-263428084666165/AnsiballZ_file.py'
Oct 07 21:39:17 compute-0 sudo[175764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:39:17 compute-0 python3.9[175766]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:39:17 compute-0 sudo[175764]: pam_unix(sudo:session): session closed for user root
Oct 07 21:39:18 compute-0 sudo[175916]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vyuutbgrryxfugrvocshufogjmmbthnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873158.4638817-1784-5616855178778/AnsiballZ_file.py'
Oct 07 21:39:18 compute-0 sudo[175916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:39:19 compute-0 python3.9[175918]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct 07 21:39:19 compute-0 sudo[175916]: pam_unix(sudo:session): session closed for user root
Oct 07 21:39:19 compute-0 sudo[176068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szgifhonnhzjsinbgnvzkpttnmvtowsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873159.3962958-1800-75293086805786/AnsiballZ_modprobe.py'
Oct 07 21:39:19 compute-0 sudo[176068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:39:20 compute-0 python3.9[176070]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Oct 07 21:39:20 compute-0 kernel: Key type psk registered
Oct 07 21:39:20 compute-0 sudo[176068]: pam_unix(sudo:session): session closed for user root
Oct 07 21:39:20 compute-0 sudo[176232]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgoyqmldxwzogbmxoygzsqbhtmlnndjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873160.3619835-1816-107494934197613/AnsiballZ_stat.py'
Oct 07 21:39:20 compute-0 sudo[176232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:39:20 compute-0 python3.9[176234]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:39:20 compute-0 sudo[176232]: pam_unix(sudo:session): session closed for user root
Oct 07 21:39:21 compute-0 sudo[176355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjwwhtnqvjajfrbwhylktwpakzbktgfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873160.3619835-1816-107494934197613/AnsiballZ_copy.py'
Oct 07 21:39:21 compute-0 sudo[176355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:39:21 compute-0 python3.9[176357]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759873160.3619835-1816-107494934197613/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:39:21 compute-0 sudo[176355]: pam_unix(sudo:session): session closed for user root
Oct 07 21:39:22 compute-0 sudo[176507]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnlhwzwkxlmuravnuhjsrxhbpvpzpkrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873161.929067-1848-50633597892880/AnsiballZ_lineinfile.py'
Oct 07 21:39:22 compute-0 sudo[176507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:39:22 compute-0 python3.9[176509]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:39:22 compute-0 sudo[176507]: pam_unix(sudo:session): session closed for user root
Oct 07 21:39:23 compute-0 sudo[176659]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmxkdjaknufvjkiuftcmmlbtxgckzqca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873162.7120647-1864-279849610247490/AnsiballZ_systemd.py'
Oct 07 21:39:23 compute-0 sudo[176659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:39:23 compute-0 python3.9[176661]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 07 21:39:23 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Oct 07 21:39:23 compute-0 systemd[1]: Stopped Load Kernel Modules.
Oct 07 21:39:23 compute-0 systemd[1]: Stopping Load Kernel Modules...
Oct 07 21:39:23 compute-0 systemd[1]: Starting Load Kernel Modules...
Oct 07 21:39:23 compute-0 systemd[1]: Finished Load Kernel Modules.
Oct 07 21:39:23 compute-0 sudo[176659]: pam_unix(sudo:session): session closed for user root
Oct 07 21:39:24 compute-0 sudo[176815]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-glcnwznbxmoeljkoxhlfpzqrczixrcoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873163.7996204-1880-38553077558465/AnsiballZ_setup.py'
Oct 07 21:39:24 compute-0 sudo[176815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:39:24 compute-0 python3.9[176817]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 07 21:39:24 compute-0 sudo[176815]: pam_unix(sudo:session): session closed for user root
Oct 07 21:39:25 compute-0 sudo[176901]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjvjwtpgqmlrzoaayxtghzjrlaipiggc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873163.7996204-1880-38553077558465/AnsiballZ_dnf.py'
Oct 07 21:39:25 compute-0 sudo[176901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:39:25 compute-0 python3.9[176903]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 07 21:39:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:39:25.577 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:39:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:39:25.577 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:39:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:39:25.578 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:39:26 compute-0 sshd-session[176818]: Invalid user nikita from 116.110.151.5 port 51250
Oct 07 21:39:26 compute-0 sshd-session[176818]: pam_unix(sshd:auth): check pass; user unknown
Oct 07 21:39:26 compute-0 sshd-session[176818]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=116.110.151.5
Oct 07 21:39:26 compute-0 podman[176906]: 2025-10-07 21:39:26.867655133 +0000 UTC m=+0.132170203 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_controller, io.buildah.version=1.41.4)
Oct 07 21:39:28 compute-0 sshd-session[176818]: Failed password for invalid user nikita from 116.110.151.5 port 51250 ssh2
Oct 07 21:39:30 compute-0 sshd-session[176818]: Connection closed by invalid user nikita 116.110.151.5 port 51250 [preauth]
Oct 07 21:39:31 compute-0 systemd[1]: Reloading.
Oct 07 21:39:31 compute-0 podman[176938]: 2025-10-07 21:39:31.694262562 +0000 UTC m=+0.074063680 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251007, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct 07 21:39:31 compute-0 systemd-rc-local-generator[176972]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:39:31 compute-0 systemd-sysv-generator[176975]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:39:31 compute-0 systemd[1]: Reloading.
Oct 07 21:39:32 compute-0 systemd-rc-local-generator[177017]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:39:32 compute-0 systemd-sysv-generator[177022]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:39:32 compute-0 systemd-logind[798]: Watching system buttons on /dev/input/event0 (Power Button)
Oct 07 21:39:32 compute-0 systemd-logind[798]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Oct 07 21:39:32 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 07 21:39:32 compute-0 systemd[1]: Starting man-db-cache-update.service...
Oct 07 21:39:32 compute-0 systemd[1]: Reloading.
Oct 07 21:39:32 compute-0 systemd-sysv-generator[177116]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:39:32 compute-0 systemd-rc-local-generator[177111]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:39:32 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Oct 07 21:39:33 compute-0 sudo[176901]: pam_unix(sudo:session): session closed for user root
Oct 07 21:39:34 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 07 21:39:34 compute-0 systemd[1]: Finished man-db-cache-update.service.
Oct 07 21:39:34 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.779s CPU time.
Oct 07 21:39:34 compute-0 systemd[1]: run-r41464b6b3ca5494ea97a01e77e0451a1.service: Deactivated successfully.
Oct 07 21:39:34 compute-0 sudo[178400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzfdivnebrtdxytdfxaasmczlomriiof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873173.8414085-1904-41428884822735/AnsiballZ_file.py'
Oct 07 21:39:34 compute-0 sudo[178400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:39:34 compute-0 python3.9[178402]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.iscsid_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:39:34 compute-0 sudo[178400]: pam_unix(sudo:session): session closed for user root
Oct 07 21:39:35 compute-0 python3.9[178552]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 07 21:39:36 compute-0 sudo[178706]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uowkpizzpekeubapzxmcofhgtbukvdht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873175.9490361-1939-204240729995629/AnsiballZ_file.py'
Oct 07 21:39:36 compute-0 sudo[178706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:39:36 compute-0 python3.9[178708]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:39:36 compute-0 sudo[178706]: pam_unix(sudo:session): session closed for user root
Oct 07 21:39:37 compute-0 sudo[178860]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktwcbgkrkvfzhpgeexuyorecaeceauvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873176.9975395-1961-192811545686378/AnsiballZ_systemd_service.py'
Oct 07 21:39:37 compute-0 sudo[178860]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:39:38 compute-0 python3.9[178862]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 07 21:39:38 compute-0 systemd[1]: Reloading.
Oct 07 21:39:38 compute-0 systemd-rc-local-generator[178889]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:39:38 compute-0 systemd-sysv-generator[178892]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:39:38 compute-0 sudo[178860]: pam_unix(sudo:session): session closed for user root
Oct 07 21:39:38 compute-0 sshd-session[178785]: Invalid user mc from 103.115.24.11 port 53972
Oct 07 21:39:38 compute-0 sshd-session[178785]: pam_unix(sshd:auth): check pass; user unknown
Oct 07 21:39:38 compute-0 sshd-session[178785]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.115.24.11
Oct 07 21:39:39 compute-0 python3.9[179046]: ansible-ansible.builtin.service_facts Invoked
Oct 07 21:39:39 compute-0 network[179063]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 07 21:39:39 compute-0 network[179064]: 'network-scripts' will be removed from distribution in near future.
Oct 07 21:39:39 compute-0 network[179065]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 07 21:39:41 compute-0 sshd-session[178785]: Failed password for invalid user mc from 103.115.24.11 port 53972 ssh2
Oct 07 21:39:41 compute-0 sshd-session[178785]: Received disconnect from 103.115.24.11 port 53972:11: Bye Bye [preauth]
Oct 07 21:39:41 compute-0 sshd-session[178785]: Disconnected from invalid user mc 103.115.24.11 port 53972 [preauth]
Oct 07 21:39:43 compute-0 sudo[179340]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifgsvplusszzsizuyrwdbywqhbgvkpww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873183.6535609-1999-195903172948066/AnsiballZ_systemd_service.py'
Oct 07 21:39:44 compute-0 sudo[179340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:39:44 compute-0 python3.9[179342]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 07 21:39:44 compute-0 sudo[179340]: pam_unix(sudo:session): session closed for user root
Oct 07 21:39:44 compute-0 sudo[179493]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ksebrhgtebsbbpkjjbjocsdyshroyvth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873184.5709443-1999-889343866543/AnsiballZ_systemd_service.py'
Oct 07 21:39:44 compute-0 sudo[179493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:39:45 compute-0 python3.9[179495]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 07 21:39:45 compute-0 sudo[179493]: pam_unix(sudo:session): session closed for user root
Oct 07 21:39:45 compute-0 sudo[179646]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzsesxgxnhnhscqoxlfpanxatgoxowbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873185.4106445-1999-245143293659704/AnsiballZ_systemd_service.py'
Oct 07 21:39:45 compute-0 sudo[179646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:39:46 compute-0 python3.9[179648]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 07 21:39:46 compute-0 sudo[179646]: pam_unix(sudo:session): session closed for user root
Oct 07 21:39:46 compute-0 sudo[179799]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqzexqcvghsdjdnrzkxqdqoexaddfkin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873186.246375-1999-149792765804786/AnsiballZ_systemd_service.py'
Oct 07 21:39:46 compute-0 sudo[179799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:39:46 compute-0 python3.9[179801]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 07 21:39:46 compute-0 sudo[179799]: pam_unix(sudo:session): session closed for user root
Oct 07 21:39:47 compute-0 podman[179927]: 2025-10-07 21:39:47.47650096 +0000 UTC m=+0.055832028 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.build-date=20251007, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 07 21:39:47 compute-0 sudo[179988]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdxutqotvqasmlprqtvgpuyciicpaxij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873187.1247518-1999-31644778304779/AnsiballZ_systemd_service.py'
Oct 07 21:39:47 compute-0 sudo[179988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:39:47 compute-0 podman[179926]: 2025-10-07 21:39:47.510717077 +0000 UTC m=+0.089489109 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, container_name=iscsid, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.build-date=20251007)
Oct 07 21:39:47 compute-0 python3.9[179993]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 07 21:39:47 compute-0 sudo[179988]: pam_unix(sudo:session): session closed for user root
Oct 07 21:39:48 compute-0 sudo[180144]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-apdemghwpquzsmbvaenhtnzcoduwjfyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873187.9877882-1999-16152863057158/AnsiballZ_systemd_service.py'
Oct 07 21:39:48 compute-0 sudo[180144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:39:48 compute-0 python3.9[180146]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 07 21:39:48 compute-0 sudo[180144]: pam_unix(sudo:session): session closed for user root
Oct 07 21:39:49 compute-0 sudo[180297]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdezlimeetcznkihwwnogbcmqvzgtacs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873188.8941772-1999-115380187446978/AnsiballZ_systemd_service.py'
Oct 07 21:39:49 compute-0 sudo[180297]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:39:49 compute-0 python3.9[180299]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 07 21:39:49 compute-0 sudo[180297]: pam_unix(sudo:session): session closed for user root
Oct 07 21:39:50 compute-0 sudo[180450]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lihvgmbopdxsljjylwtfulildbxerjrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873189.818785-1999-280315360429327/AnsiballZ_systemd_service.py'
Oct 07 21:39:50 compute-0 sudo[180450]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:39:50 compute-0 python3.9[180452]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 07 21:39:50 compute-0 sudo[180450]: pam_unix(sudo:session): session closed for user root
Oct 07 21:39:52 compute-0 sudo[180605]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzrswzeyolmeogavnnsiguhxjtguhlda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873192.108883-2117-260121410529608/AnsiballZ_file.py'
Oct 07 21:39:52 compute-0 sudo[180605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:39:52 compute-0 python3.9[180607]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:39:52 compute-0 sudo[180605]: pam_unix(sudo:session): session closed for user root
Oct 07 21:39:53 compute-0 sudo[180757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zcmlafcoanoldlhyyvfbtbazfdmxgkzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873192.9250758-2117-25533103761384/AnsiballZ_file.py'
Oct 07 21:39:53 compute-0 sudo[180757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:39:53 compute-0 python3.9[180759]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:39:53 compute-0 sudo[180757]: pam_unix(sudo:session): session closed for user root
Oct 07 21:39:53 compute-0 sudo[180909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcorrhiskgvdsccedtilecrdfspwyjda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873193.6052656-2117-246721474401025/AnsiballZ_file.py'
Oct 07 21:39:53 compute-0 sudo[180909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:39:54 compute-0 python3.9[180911]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:39:54 compute-0 sudo[180909]: pam_unix(sudo:session): session closed for user root
Oct 07 21:39:54 compute-0 unix_chkpwd[181035]: password check failed for user (root)
Oct 07 21:39:54 compute-0 sshd-session[180501]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=116.110.151.5  user=root
Oct 07 21:39:54 compute-0 sudo[181062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnhwlcsyqxieeerieacmdtdtvkwksmfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873194.3457608-2117-239280818252623/AnsiballZ_file.py'
Oct 07 21:39:54 compute-0 sudo[181062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:39:54 compute-0 python3.9[181064]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:39:54 compute-0 sudo[181062]: pam_unix(sudo:session): session closed for user root
Oct 07 21:39:55 compute-0 sudo[181214]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxcgdorxyvsnhugvfeyxiszzkkpqwedu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873195.1178677-2117-26199318787642/AnsiballZ_file.py'
Oct 07 21:39:55 compute-0 sudo[181214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:39:55 compute-0 python3.9[181216]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:39:55 compute-0 sudo[181214]: pam_unix(sudo:session): session closed for user root
Oct 07 21:39:56 compute-0 sudo[181366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrabbjqebumocmcctxcdcfcsklnwmrwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873195.935997-2117-54874542177493/AnsiballZ_file.py'
Oct 07 21:39:56 compute-0 sudo[181366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:39:56 compute-0 sshd-session[180501]: Failed password for root from 116.110.151.5 port 46854 ssh2
Oct 07 21:39:56 compute-0 python3.9[181368]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:39:56 compute-0 sudo[181366]: pam_unix(sudo:session): session closed for user root
Oct 07 21:39:57 compute-0 sudo[181529]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbsgcxjwdsawtaqiuccqgcpkjjznmsan ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873196.7287376-2117-114529441872961/AnsiballZ_file.py'
Oct 07 21:39:57 compute-0 sudo[181529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:39:57 compute-0 podman[181492]: 2025-10-07 21:39:57.17744457 +0000 UTC m=+0.145856192 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible)
Oct 07 21:39:57 compute-0 python3.9[181538]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:39:57 compute-0 sudo[181529]: pam_unix(sudo:session): session closed for user root
Oct 07 21:39:57 compute-0 sshd-session[180501]: Connection closed by authenticating user root 116.110.151.5 port 46854 [preauth]
Oct 07 21:39:57 compute-0 sudo[181697]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lsqdlatgwpzqkokxypjxjmjtpugwjwwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873197.497002-2117-256959373338848/AnsiballZ_file.py'
Oct 07 21:39:57 compute-0 sudo[181697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:39:58 compute-0 python3.9[181699]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:39:58 compute-0 sudo[181697]: pam_unix(sudo:session): session closed for user root
Oct 07 21:39:58 compute-0 sudo[181849]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmyrzaankrtgwgpiabeaijzwahmohoye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873198.2673283-2231-237129244993136/AnsiballZ_file.py'
Oct 07 21:39:58 compute-0 sudo[181849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:39:58 compute-0 python3.9[181851]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:39:58 compute-0 sudo[181849]: pam_unix(sudo:session): session closed for user root
Oct 07 21:39:59 compute-0 sudo[182002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxagcyzfiwqxpspiattbneynbitricts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873199.0397487-2231-29701982732262/AnsiballZ_file.py'
Oct 07 21:39:59 compute-0 sudo[182002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:39:59 compute-0 python3.9[182004]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:39:59 compute-0 sudo[182002]: pam_unix(sudo:session): session closed for user root
Oct 07 21:40:00 compute-0 sudo[182154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prhwlmtqklhzbikednshwrsxavwueiuk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873199.813445-2231-264621252907181/AnsiballZ_file.py'
Oct 07 21:40:00 compute-0 sudo[182154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:40:00 compute-0 python3.9[182156]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:40:00 compute-0 sudo[182154]: pam_unix(sudo:session): session closed for user root
Oct 07 21:40:00 compute-0 sudo[182306]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqdfeuftdjjpnaduugzrntdtqvmrtzhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873200.528433-2231-274148056719442/AnsiballZ_file.py'
Oct 07 21:40:00 compute-0 sudo[182306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:40:01 compute-0 python3.9[182308]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:40:01 compute-0 sudo[182306]: pam_unix(sudo:session): session closed for user root
Oct 07 21:40:01 compute-0 sudo[182458]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tejnlulidzvvlmpulhtcjephhynbnyrx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873201.2572339-2231-146945494375223/AnsiballZ_file.py'
Oct 07 21:40:01 compute-0 sudo[182458]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:40:01 compute-0 python3.9[182460]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:40:01 compute-0 sudo[182458]: pam_unix(sudo:session): session closed for user root
Oct 07 21:40:02 compute-0 podman[182584]: 2025-10-07 21:40:02.316185717 +0000 UTC m=+0.064497771 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Oct 07 21:40:02 compute-0 sudo[182625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtgrorfuairihsjdnrmwobbaoywlzmix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873201.9369938-2231-235192191151753/AnsiballZ_file.py'
Oct 07 21:40:02 compute-0 sudo[182625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:40:02 compute-0 python3.9[182629]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:40:02 compute-0 sudo[182625]: pam_unix(sudo:session): session closed for user root
Oct 07 21:40:03 compute-0 sudo[182780]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gaxrvtpltkueuwwglqlsrlhvaroexnic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873202.7186484-2231-260640765355193/AnsiballZ_file.py'
Oct 07 21:40:03 compute-0 sudo[182780]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:40:03 compute-0 python3.9[182782]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:40:03 compute-0 sudo[182780]: pam_unix(sudo:session): session closed for user root
Oct 07 21:40:03 compute-0 sudo[182932]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnkvdaaejbopcpslpvgmdzabyapljxig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873203.4951491-2231-72801608702302/AnsiballZ_file.py'
Oct 07 21:40:03 compute-0 sudo[182932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:40:03 compute-0 python3.9[182934]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:40:04 compute-0 sudo[182932]: pam_unix(sudo:session): session closed for user root
Oct 07 21:40:04 compute-0 sudo[183084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlofidqrjjipszovjsprmpqihqesvqgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873204.4802291-2347-17715392214807/AnsiballZ_command.py'
Oct 07 21:40:04 compute-0 sudo[183084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:40:05 compute-0 python3.9[183086]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:40:05 compute-0 sudo[183084]: pam_unix(sudo:session): session closed for user root
Oct 07 21:40:06 compute-0 python3.9[183238]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct 07 21:40:06 compute-0 sudo[183388]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okhcsvvhbfwaniuuwgmnnhbvktjsxrma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873206.4790845-2383-23916573282135/AnsiballZ_systemd_service.py'
Oct 07 21:40:06 compute-0 sudo[183388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:40:07 compute-0 python3.9[183390]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 07 21:40:07 compute-0 systemd[1]: Reloading.
Oct 07 21:40:07 compute-0 systemd-rc-local-generator[183413]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:40:07 compute-0 systemd-sysv-generator[183419]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:40:07 compute-0 sudo[183388]: pam_unix(sudo:session): session closed for user root
Oct 07 21:40:08 compute-0 sudo[183574]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipvqfnbmtfwjnappmzkrzrvotypsstuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873207.7395291-2399-148661285696789/AnsiballZ_command.py'
Oct 07 21:40:08 compute-0 sudo[183574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:40:08 compute-0 python3.9[183576]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:40:08 compute-0 sudo[183574]: pam_unix(sudo:session): session closed for user root
Oct 07 21:40:08 compute-0 sudo[183727]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njmcyjhbonrdymvfzvltrlomtlrakbdh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873208.5071893-2399-156604305677169/AnsiballZ_command.py'
Oct 07 21:40:08 compute-0 sudo[183727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:40:09 compute-0 python3.9[183729]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:40:09 compute-0 sudo[183727]: pam_unix(sudo:session): session closed for user root
Oct 07 21:40:09 compute-0 sudo[183880]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrgxyonqqichwocfqpnvdcebgbdlcuir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873209.270651-2399-246549692953352/AnsiballZ_command.py'
Oct 07 21:40:09 compute-0 sudo[183880]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:40:09 compute-0 python3.9[183882]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:40:09 compute-0 sudo[183880]: pam_unix(sudo:session): session closed for user root
Oct 07 21:40:10 compute-0 sudo[184033]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmwzyqpciyggkjjcujltxqzbwoqxhbxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873210.074311-2399-10355429206953/AnsiballZ_command.py'
Oct 07 21:40:10 compute-0 sudo[184033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:40:10 compute-0 python3.9[184035]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:40:10 compute-0 sudo[184033]: pam_unix(sudo:session): session closed for user root
Oct 07 21:40:11 compute-0 sudo[184186]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzsykjvmuidgomfielhfmtnjjfjambys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873210.8473527-2399-232336731474529/AnsiballZ_command.py'
Oct 07 21:40:11 compute-0 sudo[184186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:40:11 compute-0 python3.9[184188]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:40:11 compute-0 sudo[184186]: pam_unix(sudo:session): session closed for user root
Oct 07 21:40:11 compute-0 sudo[184339]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-inlauiltukmomhgvrbzzagtembtcoksa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873211.5733776-2399-274948404451200/AnsiballZ_command.py'
Oct 07 21:40:11 compute-0 sudo[184339]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:40:12 compute-0 python3.9[184341]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:40:12 compute-0 sudo[184339]: pam_unix(sudo:session): session closed for user root
Oct 07 21:40:12 compute-0 sudo[184492]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tspihyydfzrvpdxdrxqhjpxskweocair ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873212.318532-2399-232583829044922/AnsiballZ_command.py'
Oct 07 21:40:12 compute-0 sudo[184492]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:40:12 compute-0 python3.9[184494]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:40:12 compute-0 sudo[184492]: pam_unix(sudo:session): session closed for user root
Oct 07 21:40:13 compute-0 sudo[184647]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbxyhdzrasohlcetunwofufyevscmtpy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873213.0768533-2399-58487372436891/AnsiballZ_command.py'
Oct 07 21:40:13 compute-0 sudo[184647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:40:13 compute-0 python3.9[184649]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:40:13 compute-0 sudo[184647]: pam_unix(sudo:session): session closed for user root
Oct 07 21:40:15 compute-0 sudo[184800]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aoogfkrwyqxezontonlwznnhasjkztnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873214.8087883-2542-145676257009882/AnsiballZ_file.py'
Oct 07 21:40:15 compute-0 sudo[184800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:40:15 compute-0 python3.9[184802]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:40:15 compute-0 sudo[184800]: pam_unix(sudo:session): session closed for user root
Oct 07 21:40:15 compute-0 sshd-session[184595]: Invalid user admin from 116.110.151.5 port 49794
Oct 07 21:40:15 compute-0 sshd-session[184595]: pam_unix(sshd:auth): check pass; user unknown
Oct 07 21:40:15 compute-0 sshd-session[184595]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=116.110.151.5
Oct 07 21:40:15 compute-0 sudo[184952]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aaryryuuwhemiogxfipnvjtajcoaouay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873215.5898254-2542-118519060866397/AnsiballZ_file.py'
Oct 07 21:40:15 compute-0 sudo[184952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:40:16 compute-0 python3.9[184954]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:40:16 compute-0 sudo[184952]: pam_unix(sudo:session): session closed for user root
Oct 07 21:40:16 compute-0 sudo[185104]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lypoqjfutoiukrykfvtbavbsbmgrqhyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873216.3204727-2542-121066230551423/AnsiballZ_file.py'
Oct 07 21:40:16 compute-0 sudo[185104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:40:16 compute-0 python3.9[185106]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:40:16 compute-0 sudo[185104]: pam_unix(sudo:session): session closed for user root
Oct 07 21:40:17 compute-0 sudo[185256]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efivrlfkjhubvguedohguchrzbzasqtd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873217.1752956-2586-260645385715098/AnsiballZ_file.py'
Oct 07 21:40:17 compute-0 sudo[185256]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:40:17 compute-0 podman[185258]: 2025-10-07 21:40:17.688539469 +0000 UTC m=+0.093542547 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=iscsid, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Oct 07 21:40:17 compute-0 podman[185259]: 2025-10-07 21:40:17.688522378 +0000 UTC m=+0.094069442 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 07 21:40:17 compute-0 python3.9[185260]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:40:17 compute-0 sudo[185256]: pam_unix(sudo:session): session closed for user root
Oct 07 21:40:17 compute-0 sshd-session[184595]: Failed password for invalid user admin from 116.110.151.5 port 49794 ssh2
Oct 07 21:40:18 compute-0 sudo[185447]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrojtgufixiynakitfbwbbsmhfvgpcoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873217.9636352-2586-259209245509001/AnsiballZ_file.py'
Oct 07 21:40:18 compute-0 sudo[185447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:40:18 compute-0 python3.9[185449]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:40:18 compute-0 sudo[185447]: pam_unix(sudo:session): session closed for user root
Oct 07 21:40:19 compute-0 sudo[185599]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhiojiwwkgettspbrgsabhbsexdbjjbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873218.7554188-2586-235596888947081/AnsiballZ_file.py'
Oct 07 21:40:19 compute-0 sudo[185599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:40:19 compute-0 python3.9[185601]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:40:19 compute-0 sudo[185599]: pam_unix(sudo:session): session closed for user root
Oct 07 21:40:19 compute-0 sudo[185751]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihjsudqydzqdhdngxoyzyjnfsolctrej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873219.4936442-2586-42887916801698/AnsiballZ_file.py'
Oct 07 21:40:19 compute-0 sudo[185751]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:40:20 compute-0 python3.9[185753]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:40:20 compute-0 sudo[185751]: pam_unix(sudo:session): session closed for user root
Oct 07 21:40:20 compute-0 sshd-session[184595]: Connection closed by invalid user admin 116.110.151.5 port 49794 [preauth]
Oct 07 21:40:20 compute-0 sudo[185903]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tcaygrbmwdkpxixpzzsgpwvbtkwfhqmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873220.3039181-2586-128929103428895/AnsiballZ_file.py'
Oct 07 21:40:20 compute-0 sudo[185903]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:40:20 compute-0 python3.9[185905]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:40:20 compute-0 sudo[185903]: pam_unix(sudo:session): session closed for user root
Oct 07 21:40:21 compute-0 sudo[186055]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wladfanubhkvtkvrjgphacgxqibpasev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873221.0628374-2586-192842057493990/AnsiballZ_file.py'
Oct 07 21:40:21 compute-0 sudo[186055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:40:21 compute-0 python3.9[186057]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:40:21 compute-0 sudo[186055]: pam_unix(sudo:session): session closed for user root
Oct 07 21:40:22 compute-0 sudo[186207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkxdmxhjooicbkrzrcogjjvdpygnykqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873221.7932913-2586-177173001078346/AnsiballZ_file.py'
Oct 07 21:40:22 compute-0 sudo[186207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:40:22 compute-0 python3.9[186209]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:40:22 compute-0 sudo[186207]: pam_unix(sudo:session): session closed for user root
Oct 07 21:40:22 compute-0 sudo[186359]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uiomwogwfalllvfjxkzhglvqzijndgpx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873222.5395935-2586-225114716102434/AnsiballZ_file.py'
Oct 07 21:40:22 compute-0 sudo[186359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:40:23 compute-0 python3.9[186361]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:40:23 compute-0 sudo[186359]: pam_unix(sudo:session): session closed for user root
Oct 07 21:40:23 compute-0 sudo[186511]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxenifzlqnrdrsiplmuhytieufqdlfaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873223.2610073-2586-222106569142806/AnsiballZ_file.py'
Oct 07 21:40:23 compute-0 sudo[186511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:40:23 compute-0 python3.9[186513]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:40:23 compute-0 sudo[186511]: pam_unix(sudo:session): session closed for user root
Oct 07 21:40:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:40:25.579 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:40:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:40:25.579 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:40:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:40:25.579 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:40:27 compute-0 podman[186539]: 2025-10-07 21:40:27.962246763 +0000 UTC m=+0.181880372 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true)
Oct 07 21:40:28 compute-0 sudo[186691]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zidlfdkmkurglyfsaletzoplucrhbxwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873228.418679-2851-121484831562625/AnsiballZ_getent.py'
Oct 07 21:40:28 compute-0 sudo[186691]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:40:29 compute-0 python3.9[186693]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Oct 07 21:40:29 compute-0 sudo[186691]: pam_unix(sudo:session): session closed for user root
Oct 07 21:40:29 compute-0 sudo[186844]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qfylrowaxlojxtzgnaqfvjsrmrblhefo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873229.417669-2867-275051975916355/AnsiballZ_group.py'
Oct 07 21:40:29 compute-0 sudo[186844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:40:30 compute-0 python3.9[186846]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct 07 21:40:30 compute-0 groupadd[186847]: group added to /etc/group: name=nova, GID=42436
Oct 07 21:40:30 compute-0 groupadd[186847]: group added to /etc/gshadow: name=nova
Oct 07 21:40:30 compute-0 groupadd[186847]: new group: name=nova, GID=42436
Oct 07 21:40:30 compute-0 sudo[186844]: pam_unix(sudo:session): session closed for user root
Oct 07 21:40:31 compute-0 sudo[187002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ayhjwgmscwayrziukfiqtskimhxajiwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873230.505839-2883-175108870433920/AnsiballZ_user.py'
Oct 07 21:40:31 compute-0 sudo[187002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:40:31 compute-0 python3.9[187004]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct 07 21:40:31 compute-0 useradd[187006]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/0
Oct 07 21:40:31 compute-0 useradd[187006]: add 'nova' to group 'libvirt'
Oct 07 21:40:31 compute-0 useradd[187006]: add 'nova' to shadow group 'libvirt'
Oct 07 21:40:31 compute-0 sudo[187002]: pam_unix(sudo:session): session closed for user root
Oct 07 21:40:32 compute-0 sshd-session[187037]: Accepted publickey for zuul from 192.168.122.30 port 36018 ssh2: ECDSA SHA256:OH28aSFFDwSUyue/q6XYLqn3ZIRCwAhLEh6x3UJ/Ca0
Oct 07 21:40:32 compute-0 systemd-logind[798]: New session 27 of user zuul.
Oct 07 21:40:32 compute-0 systemd[1]: Started Session 27 of User zuul.
Oct 07 21:40:32 compute-0 sshd-session[187037]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 07 21:40:32 compute-0 podman[187040]: 2025-10-07 21:40:32.550409472 +0000 UTC m=+0.099702887 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251007, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 07 21:40:32 compute-0 sshd-session[187054]: Received disconnect from 192.168.122.30 port 36018:11: disconnected by user
Oct 07 21:40:32 compute-0 sshd-session[187054]: Disconnected from user zuul 192.168.122.30 port 36018
Oct 07 21:40:32 compute-0 sshd-session[187037]: pam_unix(sshd:session): session closed for user zuul
Oct 07 21:40:32 compute-0 systemd-logind[798]: Session 27 logged out. Waiting for processes to exit.
Oct 07 21:40:32 compute-0 systemd[1]: session-27.scope: Deactivated successfully.
Oct 07 21:40:32 compute-0 systemd-logind[798]: Removed session 27.
Oct 07 21:40:33 compute-0 python3.9[187210]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:40:34 compute-0 python3.9[187331]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759873232.813299-2933-60831965127741/.source.json follow=False _original_basename=config.json.j2 checksum=2c2474b5f24ef7c9ed37f49680082593e0d1100b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:40:34 compute-0 python3.9[187481]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:40:35 compute-0 python3.9[187557]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:40:36 compute-0 python3.9[187707]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:40:36 compute-0 python3.9[187828]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759873235.6315255-2933-135055288548477/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:40:37 compute-0 python3.9[187978]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:40:38 compute-0 python3.9[188099]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759873237.1037834-2933-130578743992871/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=1feba546d0beacad9258164ab79b8a747685ccc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:40:39 compute-0 python3.9[188249]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:40:39 compute-0 python3.9[188370]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759873238.570408-2933-205649672299134/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:40:40 compute-0 sudo[188520]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eipaxpwxmlnulqmmvspvobqcszvuxfdo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873240.0474486-3071-263539184006728/AnsiballZ_file.py'
Oct 07 21:40:40 compute-0 sudo[188520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:40:40 compute-0 python3.9[188522]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:40:40 compute-0 sudo[188520]: pam_unix(sudo:session): session closed for user root
Oct 07 21:40:41 compute-0 sudo[188672]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppkjyqtqarvxbqpcwdaombsotyvgdfai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873240.8540409-3087-173245664260151/AnsiballZ_copy.py'
Oct 07 21:40:41 compute-0 sudo[188672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:40:41 compute-0 python3.9[188674]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:40:41 compute-0 sudo[188672]: pam_unix(sudo:session): session closed for user root
Oct 07 21:40:42 compute-0 sudo[188824]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffkumpjahmmcnpedjraidhroozohguox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873241.6834993-3103-37907295390266/AnsiballZ_stat.py'
Oct 07 21:40:42 compute-0 sudo[188824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:40:42 compute-0 python3.9[188826]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 07 21:40:42 compute-0 sudo[188824]: pam_unix(sudo:session): session closed for user root
Oct 07 21:40:42 compute-0 sudo[188976]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unupezirozspptpqicbntiwraxeawoxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873242.5213852-3119-133046422446486/AnsiballZ_stat.py'
Oct 07 21:40:42 compute-0 sudo[188976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:40:43 compute-0 python3.9[188978]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:40:43 compute-0 sudo[188976]: pam_unix(sudo:session): session closed for user root
Oct 07 21:40:43 compute-0 sudo[189099]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oioltwwxwpackeutselvlitvtfaftisc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873242.5213852-3119-133046422446486/AnsiballZ_copy.py'
Oct 07 21:40:43 compute-0 sudo[189099]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:40:43 compute-0 python3.9[189101]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1759873242.5213852-3119-133046422446486/.source _original_basename=.zrl9m6v1 follow=False checksum=353d37032e212f995dd80e7689b9ea5a671f492a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Oct 07 21:40:43 compute-0 sudo[189099]: pam_unix(sudo:session): session closed for user root
Oct 07 21:40:44 compute-0 python3.9[189253]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 07 21:40:45 compute-0 python3.9[189405]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:40:46 compute-0 python3.9[189526]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759873244.9554806-3171-124397738541106/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=fa6ab27a406eeb4f681a9061f63acfd4959e5e19 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:40:46 compute-0 python3.9[189676]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:40:47 compute-0 python3.9[189797]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759873246.3899715-3201-241831384289636/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=4ccbfbbec3a54ab751d21d6ebc2f8f3644ff85c7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:40:47 compute-0 podman[189799]: 2025-10-07 21:40:47.835993925 +0000 UTC m=+0.070428633 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=multipathd, org.label-schema.schema-version=1.0)
Oct 07 21:40:47 compute-0 podman[189798]: 2025-10-07 21:40:47.83854281 +0000 UTC m=+0.073630327 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 07 21:40:48 compute-0 sudo[189987]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhjavnquhdgcpuhkmqxkopmtcvvkvbcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873248.3246417-3235-256683733927317/AnsiballZ_container_config_data.py'
Oct 07 21:40:48 compute-0 sudo[189987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:40:48 compute-0 python3.9[189989]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Oct 07 21:40:48 compute-0 sudo[189987]: pam_unix(sudo:session): session closed for user root
Oct 07 21:40:49 compute-0 sudo[190139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dauedynmvinblcoraybbgetohuxavolh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873249.252526-3253-245995707716159/AnsiballZ_container_config_hash.py'
Oct 07 21:40:49 compute-0 sudo[190139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:40:49 compute-0 python3.9[190141]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 07 21:40:49 compute-0 sudo[190139]: pam_unix(sudo:session): session closed for user root
Oct 07 21:40:50 compute-0 sudo[190291]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvkkhemytgdmowoywkkrslnoaufkbjxa ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759873250.212679-3273-85205589746881/AnsiballZ_edpm_container_manage.py'
Oct 07 21:40:50 compute-0 sudo[190291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:40:50 compute-0 python3[190293]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Oct 07 21:40:51 compute-0 podman[190330]: 2025-10-07 21:40:51.064284673 +0000 UTC m=+0.050932516 container create 9882b6bd3ae3bea204900ba4855035be5f4146e43065b892cff5183f0ef88d4a (image=38.102.83.12:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, container_name=nova_compute_init, tcib_managed=true, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, config_data={'image': '38.102.83.12:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm)
Oct 07 21:40:51 compute-0 podman[190330]: 2025-10-07 21:40:51.039175061 +0000 UTC m=+0.025822934 image pull e40a82bb5768ddbb81728291deee4da0629f3c0ac149f80011e3a69d48a891a3 38.102.83.12:5001/podified-master-centos10/openstack-nova-compute:watcher_latest
Oct 07 21:40:51 compute-0 python3[190293]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': '38.102.83.12:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z 38.102.83.12:5001/podified-master-centos10/openstack-nova-compute:watcher_latest bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Oct 07 21:40:51 compute-0 sudo[190291]: pam_unix(sudo:session): session closed for user root
Oct 07 21:40:51 compute-0 sudo[190519]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dubvzkktwhnuwrpxytkazphwrmsqjzsd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873251.6653967-3289-256921259119824/AnsiballZ_stat.py'
Oct 07 21:40:51 compute-0 sudo[190519]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:40:52 compute-0 python3.9[190521]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 07 21:40:52 compute-0 sudo[190519]: pam_unix(sudo:session): session closed for user root
Oct 07 21:40:53 compute-0 sudo[190673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-refggatidahkrsuzpxfvqrniwbpyetoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873252.9938543-3313-187339825647250/AnsiballZ_container_config_data.py'
Oct 07 21:40:53 compute-0 sudo[190673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:40:53 compute-0 python3.9[190675]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Oct 07 21:40:53 compute-0 sudo[190673]: pam_unix(sudo:session): session closed for user root
Oct 07 21:40:54 compute-0 sudo[190825]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxgaezveyxertngnhuodgqbzwuyoksma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873253.8907921-3331-231752243540779/AnsiballZ_container_config_hash.py'
Oct 07 21:40:54 compute-0 sudo[190825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:40:54 compute-0 python3.9[190827]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 07 21:40:54 compute-0 sudo[190825]: pam_unix(sudo:session): session closed for user root
Oct 07 21:40:55 compute-0 sudo[190977]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-flpuqdqmyrdturwlmjrykamxbfiuvqwp ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759873254.8739796-3351-9759413732651/AnsiballZ_edpm_container_manage.py'
Oct 07 21:40:55 compute-0 sudo[190977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:40:55 compute-0 python3[190979]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Oct 07 21:40:55 compute-0 podman[191016]: 2025-10-07 21:40:55.766057184 +0000 UTC m=+0.068509917 container create b4aaf72852b786c91155420dda1a94ebd5742188b177eabf48cfb0ef87874926 (image=38.102.83.12:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, container_name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'image': '38.102.83.12:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4)
Oct 07 21:40:55 compute-0 podman[191016]: 2025-10-07 21:40:55.736609576 +0000 UTC m=+0.039062309 image pull e40a82bb5768ddbb81728291deee4da0629f3c0ac149f80011e3a69d48a891a3 38.102.83.12:5001/podified-master-centos10/openstack-nova-compute:watcher_latest
Oct 07 21:40:55 compute-0 python3[190979]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': '38.102.83.12:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro 38.102.83.12:5001/podified-master-centos10/openstack-nova-compute:watcher_latest kolla_start
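The PODMAN-CONTAINER-DEBUG entry above shows edpm_container_manage flattening its `config_data` dict (image, privileged, user, net, environment, volumes, command) into a `podman create` invocation. A minimal sketch of that mapping, assuming a simplified flag set; `build_podman_args` is a hypothetical helper for illustration, not the actual edpm_ansible code:

```python
# Hypothetical sketch: translate an edpm-style config_data dict into
# podman create arguments (simplified subset of the flags seen in the log).

def build_podman_args(name, config_data):
    args = ["podman", "create", "--name", name]
    for key, value in config_data.get("environment", {}).items():
        args += ["--env", f"{key}={value}"]           # --env KOLLA_CONFIG_STRATEGY=...
    if config_data.get("net"):
        args += ["--network", config_data["net"]]     # 'net': 'host' -> --network host
    args.append(f"--privileged={config_data.get('privileged', False)}")
    if config_data.get("user"):
        args += ["--user", config_data["user"]]
    for vol in config_data.get("volumes", []):
        args += ["--volume", vol]                     # host:container[:options]
    args.append(config_data["image"])                 # image comes before the command
    if config_data.get("command"):
        args += config_data["command"].split()
    return args

# Illustrative config; registry/image names here are placeholders.
config = {
    "image": "registry.example/openstack-nova-compute:latest",
    "privileged": True,
    "user": "nova",
    "command": "kolla_start",
    "net": "host",
    "environment": {"KOLLA_CONFIG_STRATEGY": "COPY_ALWAYS"},
    "volumes": ["/var/lib/nova:/var/lib/nova:shared"],
}
print(" ".join(build_podman_args("nova_compute", config)))
```

The real role also emits `--conmon-pidfile`, `--label`, and `--log-driver journald` flags, as the log line shows.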
Oct 07 21:40:55 compute-0 sudo[190977]: pam_unix(sudo:session): session closed for user root
Oct 07 21:40:56 compute-0 sudo[191204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghgkhyamzmtdwmgpppyildaxkhdxlirc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873256.1825838-3367-196501802810026/AnsiballZ_stat.py'
Oct 07 21:40:56 compute-0 sudo[191204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:40:56 compute-0 python3.9[191206]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 07 21:40:56 compute-0 sudo[191204]: pam_unix(sudo:session): session closed for user root
Oct 07 21:40:57 compute-0 sudo[191358]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nprgsljjwudyysmxwfyxigesvsxruaoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873257.1544583-3385-93955617219538/AnsiballZ_file.py'
Oct 07 21:40:57 compute-0 sudo[191358]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:40:57 compute-0 python3.9[191360]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:40:57 compute-0 sudo[191358]: pam_unix(sudo:session): session closed for user root
Oct 07 21:40:58 compute-0 sudo[191521]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbohvpgtmtlywavliixinampisudalsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873257.7830439-3385-76276734240994/AnsiballZ_copy.py'
Oct 07 21:40:58 compute-0 sudo[191521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:40:58 compute-0 podman[191483]: 2025-10-07 21:40:58.445433093 +0000 UTC m=+0.127185147 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Oct 07 21:40:58 compute-0 python3.9[191528]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759873257.7830439-3385-76276734240994/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:40:58 compute-0 sudo[191521]: pam_unix(sudo:session): session closed for user root
Oct 07 21:40:58 compute-0 sudo[191611]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhscqiclcxhyqkphsjzzdjtzyjvilbgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873257.7830439-3385-76276734240994/AnsiballZ_systemd.py'
Oct 07 21:40:58 compute-0 sudo[191611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:40:59 compute-0 python3.9[191613]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 07 21:40:59 compute-0 systemd[1]: Reloading.
Oct 07 21:40:59 compute-0 systemd-rc-local-generator[191641]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:40:59 compute-0 systemd-sysv-generator[191645]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:40:59 compute-0 sudo[191611]: pam_unix(sudo:session): session closed for user root
Oct 07 21:40:59 compute-0 sudo[191723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysgcdeiwazwoixokxkjcjhspxyoholcr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873257.7830439-3385-76276734240994/AnsiballZ_systemd.py'
Oct 07 21:40:59 compute-0 sudo[191723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:41:00 compute-0 python3.9[191725]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 07 21:41:00 compute-0 systemd[1]: Reloading.
Oct 07 21:41:00 compute-0 systemd-rc-local-generator[191751]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:41:00 compute-0 systemd-sysv-generator[191756]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:41:00 compute-0 systemd[1]: Starting nova_compute container...
Oct 07 21:41:00 compute-0 systemd[1]: Started libcrun container.
Oct 07 21:41:00 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a8cb0b0796237dbe0ed5e0c62320ec1ec0129aa21971ce50da646d499c170b2a/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Oct 07 21:41:00 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a8cb0b0796237dbe0ed5e0c62320ec1ec0129aa21971ce50da646d499c170b2a/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct 07 21:41:00 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a8cb0b0796237dbe0ed5e0c62320ec1ec0129aa21971ce50da646d499c170b2a/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Oct 07 21:41:00 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a8cb0b0796237dbe0ed5e0c62320ec1ec0129aa21971ce50da646d499c170b2a/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct 07 21:41:00 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a8cb0b0796237dbe0ed5e0c62320ec1ec0129aa21971ce50da646d499c170b2a/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Oct 07 21:41:00 compute-0 podman[191765]: 2025-10-07 21:41:00.781731765 +0000 UTC m=+0.119140914 container init b4aaf72852b786c91155420dda1a94ebd5742188b177eabf48cfb0ef87874926 (image=38.102.83.12:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, container_name=nova_compute, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': '38.102.83.12:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, config_id=edpm, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007)
Oct 07 21:41:00 compute-0 podman[191765]: 2025-10-07 21:41:00.79222133 +0000 UTC m=+0.129630459 container start b4aaf72852b786c91155420dda1a94ebd5742188b177eabf48cfb0ef87874926 (image=38.102.83.12:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': '38.102.83.12:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, config_id=edpm, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, container_name=nova_compute, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Oct 07 21:41:00 compute-0 podman[191765]: nova_compute
Oct 07 21:41:00 compute-0 nova_compute[191780]: + sudo -E kolla_set_configs
Oct 07 21:41:00 compute-0 systemd[1]: Started nova_compute container.
Oct 07 21:41:00 compute-0 sudo[191723]: pam_unix(sudo:session): session closed for user root
Oct 07 21:41:00 compute-0 nova_compute[191780]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 07 21:41:00 compute-0 nova_compute[191780]: INFO:__main__:Validating config file
Oct 07 21:41:00 compute-0 nova_compute[191780]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 07 21:41:00 compute-0 nova_compute[191780]: INFO:__main__:Copying service configuration files
Oct 07 21:41:00 compute-0 nova_compute[191780]: INFO:__main__:Deleting /etc/nova/nova.conf
Oct 07 21:41:00 compute-0 nova_compute[191780]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Oct 07 21:41:00 compute-0 nova_compute[191780]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Oct 07 21:41:00 compute-0 nova_compute[191780]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Oct 07 21:41:00 compute-0 nova_compute[191780]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Oct 07 21:41:00 compute-0 nova_compute[191780]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Oct 07 21:41:00 compute-0 nova_compute[191780]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Oct 07 21:41:00 compute-0 nova_compute[191780]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Oct 07 21:41:00 compute-0 nova_compute[191780]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Oct 07 21:41:00 compute-0 nova_compute[191780]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct 07 21:41:00 compute-0 nova_compute[191780]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct 07 21:41:00 compute-0 nova_compute[191780]: INFO:__main__:Deleting /etc/ceph
Oct 07 21:41:00 compute-0 nova_compute[191780]: INFO:__main__:Creating directory /etc/ceph
Oct 07 21:41:00 compute-0 nova_compute[191780]: INFO:__main__:Setting permission for /etc/ceph
Oct 07 21:41:00 compute-0 nova_compute[191780]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Oct 07 21:41:00 compute-0 nova_compute[191780]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct 07 21:41:00 compute-0 nova_compute[191780]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Oct 07 21:41:00 compute-0 nova_compute[191780]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct 07 21:41:00 compute-0 nova_compute[191780]: INFO:__main__:Writing out command to execute
Oct 07 21:41:00 compute-0 nova_compute[191780]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Oct 07 21:41:00 compute-0 nova_compute[191780]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct 07 21:41:00 compute-0 nova_compute[191780]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct 07 21:41:00 compute-0 nova_compute[191780]: ++ cat /run_command
Oct 07 21:41:00 compute-0 nova_compute[191780]: + CMD=nova-compute
Oct 07 21:41:00 compute-0 nova_compute[191780]: + ARGS=
Oct 07 21:41:00 compute-0 nova_compute[191780]: + sudo kolla_copy_cacerts
Oct 07 21:41:00 compute-0 nova_compute[191780]: + [[ ! -n '' ]]
Oct 07 21:41:00 compute-0 nova_compute[191780]: + . kolla_extend_start
Oct 07 21:41:00 compute-0 nova_compute[191780]: Running command: 'nova-compute'
Oct 07 21:41:00 compute-0 nova_compute[191780]: + echo 'Running command: '\''nova-compute'\'''
Oct 07 21:41:00 compute-0 nova_compute[191780]: + umask 0022
Oct 07 21:41:00 compute-0 nova_compute[191780]: + exec nova-compute
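The startup trace above shows the kolla entrypoint sequence: `kolla_set_configs` copies every file listed in `/var/lib/kolla/config_files/config.json` into place (the strategy `COPY_ALWAYS` re-copies on every start), then the command stored in `/run_command` is read and exec'd. A rough sketch of the COPY_ALWAYS behavior, assuming a simplified config-file list; this is an illustration, not the actual kolla_set_configs implementation:

```python
# Hypothetical simplification of kolla_set_configs' copy strategies:
# COPY_ALWAYS overwrites the destination on every container start,
# while COPY_ONCE skips files whose destination already exists.
import shutil
from pathlib import Path

def copy_config_files(config_files, strategy="COPY_ALWAYS"):
    copied = []
    for entry in config_files:
        src = Path(entry["source"])
        dest = Path(entry["dest"])
        if strategy == "COPY_ONCE" and dest.exists():
            continue                                  # keep the existing file
        dest.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src, dest)                       # overwrite unconditionally
        copied.append(str(dest))
    return copied
```

After the copies succeed, the entrypoint reads `/run_command` (here `nova-compute`) and replaces itself with that process via `exec`, which is why the remaining journal entries carry the `nova_compute[191780]` tag directly.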
Oct 07 21:41:02 compute-0 python3.9[191941]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 07 21:41:02 compute-0 podman[192064]: 2025-10-07 21:41:02.85814831 +0000 UTC m=+0.086040008 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct 07 21:41:03 compute-0 python3.9[192109]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 07 21:41:03 compute-0 nova_compute[191780]: 2025-10-07 21:41:03.019 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Oct 07 21:41:03 compute-0 nova_compute[191780]: 2025-10-07 21:41:03.019 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Oct 07 21:41:03 compute-0 nova_compute[191780]: 2025-10-07 21:41:03.019 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Oct 07 21:41:03 compute-0 nova_compute[191780]: 2025-10-07 21:41:03.019 2 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Oct 07 21:41:03 compute-0 nova_compute[191780]: 2025-10-07 21:41:03.145 2 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:41:03 compute-0 nova_compute[191780]: 2025-10-07 21:41:03.175 2 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.030s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:41:03 compute-0 nova_compute[191780]: 2025-10-07 21:41:03.206 2 INFO oslo_service.periodic_task [-] Skipping periodic task _heal_instance_info_cache because its interval is negative
Oct 07 21:41:03 compute-0 nova_compute[191780]: 2025-10-07 21:41:03.208 2 WARNING oslo_config.cfg [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] Deprecated: Option "heartbeat_in_pthread" from group "oslo_messaging_rabbit" is deprecated for removal (The option is related to Eventlet which will be removed. In addition this has never worked as expected with services using eventlet for core service framework.).  Its value may be silently ignored in the future.
Oct 07 21:41:04 compute-0 python3.9[192265]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.345 2 INFO nova.virt.driver [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.451 2 INFO nova.compute.provider_config [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Oct 07 21:41:04 compute-0 sudo[192417]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qurxucclpypgzfjiletupidzkdotuwec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873264.3651114-3505-201381262786880/AnsiballZ_podman_container.py'
Oct 07 21:41:04 compute-0 sudo[192417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.959 2 DEBUG oslo_concurrency.lockutils [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.960 2 DEBUG oslo_concurrency.lockutils [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.961 2 DEBUG oslo_concurrency.lockutils [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.962 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/service.py:274
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.962 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2804
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.962 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2805
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.963 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2806
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.963 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2807
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.964 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2809
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.964 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.965 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.965 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.965 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.965 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.966 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.966 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cell_worker_thread_pool_size   = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.966 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.967 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.967 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.967 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.968 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.968 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.968 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.968 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.969 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.969 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.969 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.970 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.970 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.970 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.971 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.971 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] default_green_pool_size        = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.971 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.972 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.972 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] default_thread_pool_size       = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.972 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.972 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.973 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] fatal_deprecations             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.973 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.973 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.974 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.974 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.974 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] heal_instance_info_cache_interval = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.975 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.975 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.975 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.975 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.976 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] injected_network_template      = /usr/lib/python3.12/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.976 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.976 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.977 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.977 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.977 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.978 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.978 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.978 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.979 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.979 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] key                            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.979 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.979 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] log_color                      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.980 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.980 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.980 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.980 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.981 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.981 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.981 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.981 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.982 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.982 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.982 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.983 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.983 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.983 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.984 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.984 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.984 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.985 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.985 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.985 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.986 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.986 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.986 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.987 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.987 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.987 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] my_shared_fs_storage_ip        = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.987 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.988 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.988 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.988 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.989 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.989 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.989 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.989 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.990 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.990 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] pybasedir                      = /usr/lib/python3.12/site-packages log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.990 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.991 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.991 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.991 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.991 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.992 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.992 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] record                         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.992 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.993 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.993 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.993 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.993 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.994 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.994 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.994 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.994 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.995 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.995 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.995 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.996 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.996 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.996 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.996 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.997 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.997 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.997 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.997 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] shell_completion               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.998 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.998 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.998 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.998 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.999 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.999 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:04 compute-0 nova_compute[191780]: 2025-10-07 21:41:04.999 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.000 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.000 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.000 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.000 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] thread_pool_statistic_period   = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.001 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.001 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.001 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.001 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.002 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.002 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.002 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.003 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.003 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.003 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.003 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.004 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.004 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.004 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.005 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.005 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.005 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.006 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.006 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] os_brick.lock_path             = /var/lib/nova/tmp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.006 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.007 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.007 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.007 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.008 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.008 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.008 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.009 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.009 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.009 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.009 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.010 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.010 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.010 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.011 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.011 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.011 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.012 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.012 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.012 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] api.neutron_default_project_id = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.012 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] api.response_validation        = warn log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.012 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.012 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.013 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.013 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.013 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.013 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.013 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.013 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.014 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.014 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.014 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cache.backend_expiration_time  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.014 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.014 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.014 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.015 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.015 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.015 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.015 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cache.enforce_fips_mode        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.015 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.015 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.016 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.016 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.016 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cache.memcache_password        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.016 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.016 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.016 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.017 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.017 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.017 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.017 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.017 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cache.memcache_username        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.018 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.018 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cache.redis_db                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.018 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cache.redis_password           = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.018 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cache.redis_sentinel_service_name = mymaster log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.018 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cache.redis_sentinels          = ['localhost:26379'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.018 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cache.redis_server             = localhost:6379 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.019 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cache.redis_socket_timeout     = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.019 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cache.redis_username           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.019 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.019 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.019 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.019 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.020 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.020 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.020 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.020 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.020 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.020 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.021 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.021 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.021 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.021 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.021 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.022 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.022 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.022 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.022 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.022 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.022 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.023 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.023 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.023 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.023 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.023 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.024 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.024 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.024 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.024 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.024 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.024 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.025 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.025 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.025 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.025 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] compute.sharing_providers_max_uuids_per_request = 200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.025 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.025 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.026 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.026 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.026 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.026 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.026 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] consoleauth.enforce_session_timeout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.026 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.027 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.027 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.027 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.027 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.027 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.027 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.028 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.028 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.028 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.028 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.028 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.029 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cyborg.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.029 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.029 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.029 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.029 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.029 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.030 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.030 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.030 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.030 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] database.asyncio_connection    = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.030 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.030 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.031 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.031 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.031 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.031 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.031 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.031 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.032 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.032 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.032 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.032 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.032 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.033 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.033 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.033 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.033 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.033 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.033 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.034 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.034 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] api_database.asyncio_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.034 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] api_database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.034 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.034 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.035 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.035 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.035 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.035 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.035 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.035 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.036 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.036 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.036 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.036 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.036 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.036 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.037 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.037 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.037 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.037 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.037 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.037 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.038 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.038 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] ephemeral_storage_encryption.default_format = luks log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.038 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.038 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.038 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.038 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.039 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.039 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.039 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.039 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.039 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.039 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.040 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.040 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.042 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.042 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.042 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.042 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.043 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.043 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.043 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.043 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.043 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.043 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.044 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.044 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] glance.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.044 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.044 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.044 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.044 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.045 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.045 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.045 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.045 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.045 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.045 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.046 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] manila.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.046 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] manila.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.046 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] manila.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.046 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] manila.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.046 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] manila.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.046 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] manila.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.047 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] manila.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.047 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] manila.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.047 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] manila.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.047 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] manila.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.047 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] manila.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.047 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] manila.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.048 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] manila.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.048 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] manila.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.048 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] manila.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.048 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] manila.service_type            = shared-file-system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.048 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] manila.share_apply_policy_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.048 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] manila.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.048 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] manila.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.048 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] manila.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.049 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] manila.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.049 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] manila.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.049 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] manila.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.049 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.049 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.049 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.050 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.050 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.050 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.050 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.050 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.050 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.050 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.050 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.051 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.051 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.051 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.051 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.051 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] ironic.conductor_group         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.051 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.051 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.051 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.052 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.052 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.052 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.052 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.052 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.052 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.052 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] ironic.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.052 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.053 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.053 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.053 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] ironic.shard                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.053 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.053 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.053 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.053 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.053 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.053 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.054 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.054 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.054 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.054 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.054 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.054 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.054 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.055 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.055 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.055 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.055 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.055 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.055 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 python3.9[192419]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.055 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.055 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.056 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.056 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.056 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.056 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.056 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.056 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.056 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.056 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.056 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.057 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.057 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.057 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.057 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.057 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vault.approle_role_id          = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.057 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vault.approle_secret_id        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.057 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.057 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vault.kv_path                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.058 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.058 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.058 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vault.root_token_id            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.058 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.058 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vault.timeout                  = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.058 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.058 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.058 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.058 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.059 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.059 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.059 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.059 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.059 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.059 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.059 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.059 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.060 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.060 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] keystone.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.060 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.060 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.060 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.060 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.060 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.060 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.060 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.061 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.061 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.ceph_mount_options     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.061 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.ceph_mount_point_base  = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.061 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.061 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.061 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.061 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.062 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.062 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.062 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.062 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.062 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.062 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.062 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.062 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.062 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.063 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.063 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.063 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.063 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.063 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.063 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.063 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.063 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.064 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.064 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.064 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.064 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.064 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.064 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.065 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.065 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.065 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.065 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.065 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.065 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.065 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.065 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.066 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.066 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.066 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.066 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.066 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.066 2 WARNING oslo_config.cfg [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Oct 07 21:41:05 compute-0 nova_compute[191780]: live_migration_uri is deprecated for removal in favor of two other options that
Oct 07 21:41:05 compute-0 nova_compute[191780]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Oct 07 21:41:05 compute-0 nova_compute[191780]: and ``live_migration_inbound_addr`` respectively.
Oct 07 21:41:05 compute-0 nova_compute[191780]: ).  Its value may be silently ignored in the future.
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.067 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.067 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.067 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.067 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.067 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.migration_inbound_addr = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.067 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.067 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.068 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.068 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.068 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.068 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.068 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.068 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.068 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.068 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.069 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.069 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.069 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.069 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.069 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.069 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.069 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.069 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.070 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.070 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.070 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.070 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.070 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.070 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.070 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.070 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.071 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.071 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.071 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.071 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.071 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.071 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.071 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.071 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.tb_cache_size          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.072 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.072 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.072 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.072 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.072 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.072 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.072 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.volume_enforce_multipath = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.073 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.073 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.073 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.073 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.073 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.073 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.073 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.073 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.074 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.074 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.074 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.074 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.074 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.074 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.074 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.074 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.075 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.075 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.075 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.075 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.075 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.075 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.075 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.076 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.076 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.076 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.076 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.076 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.077 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] neutron.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.077 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.077 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.077 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.077 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.077 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.078 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.078 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.078 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.078 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.078 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.078 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.079 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] notifications.include_share_mapping = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.079 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.079 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.079 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.079 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.079 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.080 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.080 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.080 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.080 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.080 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.080 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.080 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.081 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.081 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.081 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.081 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.081 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.081 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.081 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.082 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.082 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.082 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.082 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.082 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.082 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.082 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.082 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.083 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.083 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.083 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] placement.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.083 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.083 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.083 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.084 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.084 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.084 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.084 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.084 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.084 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.084 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.085 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.085 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.085 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.085 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.085 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.085 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.086 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.086 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.086 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.086 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.086 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.086 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.086 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.087 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.087 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.087 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.087 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.087 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] quota.unified_limits_resource_list = ['servers'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.087 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] quota.unified_limits_resource_strategy = require log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.088 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.088 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.088 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.088 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.088 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.088 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.089 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.089 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.089 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.089 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.089 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.089 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.090 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.090 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.090 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.090 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.090 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.090 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.091 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.091 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] filter_scheduler.hypervisor_version_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.091 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.091 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] filter_scheduler.image_props_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.091 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] filter_scheduler.image_props_weight_setting = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.091 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.092 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.092 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.092 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.092 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.092 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] filter_scheduler.num_instances_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.092 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.093 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.093 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.093 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.093 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.093 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.093 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.094 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.094 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.094 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.094 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.094 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.094 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.095 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.095 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.095 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.095 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.096 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.096 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.096 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.096 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.096 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.096 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.097 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.097 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.097 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.097 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.097 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.097 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.097 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.098 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.098 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.098 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.098 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.098 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.099 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.099 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.099 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] spice.require_secure           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.099 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.099 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.099 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] spice.spice_direct_proxy_base_url = http://127.0.0.1:13002/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.100 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.100 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.100 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.100 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.100 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.100 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.100 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.101 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.101 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.101 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.101 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.101 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.101 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.102 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.102 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.102 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.102 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.102 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.102 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.102 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.103 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.103 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.103 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.103 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.103 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.103 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.103 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.104 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.104 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.104 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.104 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.104 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.104 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.104 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.105 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.105 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.105 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.105 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.105 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.105 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.105 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.105 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.106 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.106 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.106 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.106 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.106 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.106 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.106 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.107 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.107 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.107 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.107 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.107 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.107 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.107 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.107 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.108 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.108 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.108 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.108 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.108 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.108 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.108 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.108 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.109 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.109 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.109 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.109 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.109 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.109 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.109 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.109 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.109 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.110 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.110 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.110 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.110 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.110 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.110 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.110 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.110 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.111 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.111 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.111 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.111 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.111 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.111 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.111 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_messaging_rabbit.hostname = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.111 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.112 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.112 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.112 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.112 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_splay = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.112 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_messaging_rabbit.processname = nova-compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.112 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.112 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.112 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.113 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.113 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.113 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.113 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.113 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.113 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.113 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.113 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_messaging_rabbit.rabbit_stream_fanout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.114 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.114 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_messaging_rabbit.rabbit_transient_quorum_queue = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.114 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.114 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.114 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.114 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.114 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.114 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.114 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.115 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_messaging_rabbit.use_queue_manager = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.115 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.115 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.115 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.115 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.115 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.115 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.116 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.116 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.116 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.116 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.116 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.116 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.116 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.116 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.116 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.117 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.117 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.117 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_limit.endpoint_interface  = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.117 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.117 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_limit.endpoint_region_name = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.117 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_limit.endpoint_service_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.117 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_limit.endpoint_service_type = compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.117 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.117 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.118 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.118 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.118 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.118 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.118 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.118 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.118 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.118 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.118 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_limit.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.119 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.119 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.119 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.119 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.119 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.119 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.119 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.119 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.120 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.120 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.120 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.120 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.120 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.120 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.120 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.120 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.120 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.121 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.121 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.121 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.121 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vif_plug_linux_bridge_privileged.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.121 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.121 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.121 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.121 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.121 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.122 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.122 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vif_plug_ovs_privileged.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.122 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.122 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.122 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.122 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.122 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.122 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.123 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.123 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.123 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.123 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.123 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.123 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] os_vif_ovs.default_qos_type    = linux-noop log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.123 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.123 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.124 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.124 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.124 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.124 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.124 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] privsep_osbrick.capabilities   = [21, 2] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.124 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.124 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.124 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] privsep_osbrick.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.125 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.125 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.125 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.125 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.125 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.125 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.125 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] nova_sys_admin.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.126 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.126 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.126 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.126 2 DEBUG oslo_service.backend._eventlet.service [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2828
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.128 2 INFO nova.service [-] Starting compute node (version 32.1.0-0.20251007122402.7278e66.el10)
Oct 07 21:41:05 compute-0 sudo[192417]: pam_unix(sudo:session): session closed for user root
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.636 2 DEBUG nova.virt.libvirt.host [None req-19294ca5-a7f2-4cf8-9fc7-069c379f5e3c - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:498
Oct 07 21:41:05 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Oct 07 21:41:05 compute-0 systemd[1]: Started libvirt QEMU daemon.
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.733 2 DEBUG nova.virt.libvirt.host [None req-19294ca5-a7f2-4cf8-9fc7-069c379f5e3c - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f2ac71f4290> _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:504
Oct 07 21:41:05 compute-0 nova_compute[191780]: libvirt:  error : internal error: could not initialize domain event timer
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.735 2 WARNING nova.virt.libvirt.host [None req-19294ca5-a7f2-4cf8-9fc7-069c379f5e3c - - - - - -] URI qemu:///system does not support events: internal error: could not initialize domain event timer: libvirt.libvirtError: internal error: could not initialize domain event timer
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.735 2 DEBUG nova.virt.libvirt.host [None req-19294ca5-a7f2-4cf8-9fc7-069c379f5e3c - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f2ac71f4290> _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:525
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.737 2 DEBUG nova.virt.libvirt.host [None req-19294ca5-a7f2-4cf8-9fc7-069c379f5e3c - - - - - -] Starting native event thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:484
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.738 2 DEBUG nova.virt.libvirt.host [None req-19294ca5-a7f2-4cf8-9fc7-069c379f5e3c - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:490
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.738 2 INFO nova.utils [None req-19294ca5-a7f2-4cf8-9fc7-069c379f5e3c - - - - - -] The default thread pool MainProcess.default is initialized
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.739 2 DEBUG nova.virt.libvirt.host [None req-19294ca5-a7f2-4cf8-9fc7-069c379f5e3c - - - - - -] Starting connection event dispatch thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:493
Oct 07 21:41:05 compute-0 nova_compute[191780]: 2025-10-07 21:41:05.739 2 INFO nova.virt.libvirt.driver [None req-19294ca5-a7f2-4cf8-9fc7-069c379f5e3c - - - - - -] Connection event '1' reason 'None'
Oct 07 21:41:05 compute-0 sudo[192643]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zthvbetqymvaogkdsggrmtglttvmccea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873265.5871232-3521-177510001510582/AnsiballZ_systemd.py'
Oct 07 21:41:05 compute-0 sudo[192643]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:41:06 compute-0 nova_compute[191780]: 2025-10-07 21:41:06.247 2 WARNING nova.virt.libvirt.driver [None req-19294ca5-a7f2-4cf8-9fc7-069c379f5e3c - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Oct 07 21:41:06 compute-0 nova_compute[191780]: 2025-10-07 21:41:06.249 2 DEBUG nova.virt.libvirt.volume.mount [None req-19294ca5-a7f2-4cf8-9fc7-069c379f5e3c - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/mount.py:130
Oct 07 21:41:06 compute-0 python3.9[192645]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 07 21:41:06 compute-0 systemd[1]: Stopping nova_compute container...
Oct 07 21:41:06 compute-0 nova_compute[191780]: 2025-10-07 21:41:06.513 2 DEBUG oslo_concurrency.lockutils [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 07 21:41:06 compute-0 nova_compute[191780]: 2025-10-07 21:41:06.514 2 DEBUG oslo_concurrency.lockutils [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 07 21:41:06 compute-0 nova_compute[191780]: 2025-10-07 21:41:06.514 2 DEBUG oslo_concurrency.lockutils [None req-f72d2be7-4623-41ad-bf99-2359a3a282b7 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 07 21:41:07 compute-0 virtqemud[192532]: libvirt version: 10.10.0, package: 15.el9 (builder@centos.org, 2025-08-18-13:22:20, )
Oct 07 21:41:07 compute-0 systemd[1]: libpod-b4aaf72852b786c91155420dda1a94ebd5742188b177eabf48cfb0ef87874926.scope: Deactivated successfully.
Oct 07 21:41:07 compute-0 virtqemud[192532]: hostname: compute-0
Oct 07 21:41:07 compute-0 systemd[1]: libpod-b4aaf72852b786c91155420dda1a94ebd5742188b177eabf48cfb0ef87874926.scope: Consumed 3.265s CPU time.
Oct 07 21:41:07 compute-0 virtqemud[192532]: End of file while reading data: Input/output error
Oct 07 21:41:07 compute-0 podman[192657]: 2025-10-07 21:41:07.14093989 +0000 UTC m=+0.695609304 container died b4aaf72852b786c91155420dda1a94ebd5742188b177eabf48cfb0ef87874926 (image=38.102.83.12:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'image': '38.102.83.12:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=nova_compute, io.buildah.version=1.41.4, tcib_managed=true, managed_by=edpm_ansible)
Oct 07 21:41:07 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b4aaf72852b786c91155420dda1a94ebd5742188b177eabf48cfb0ef87874926-userdata-shm.mount: Deactivated successfully.
Oct 07 21:41:07 compute-0 systemd[1]: var-lib-containers-storage-overlay-a8cb0b0796237dbe0ed5e0c62320ec1ec0129aa21971ce50da646d499c170b2a-merged.mount: Deactivated successfully.
Oct 07 21:41:07 compute-0 podman[192657]: 2025-10-07 21:41:07.20508196 +0000 UTC m=+0.759751294 container cleanup b4aaf72852b786c91155420dda1a94ebd5742188b177eabf48cfb0ef87874926 (image=38.102.83.12:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, org.label-schema.vendor=CentOS, container_name=nova_compute, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'image': '38.102.83.12:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, org.label-schema.license=GPLv2)
Oct 07 21:41:07 compute-0 podman[192657]: nova_compute
Oct 07 21:41:07 compute-0 podman[192687]: nova_compute
Oct 07 21:41:07 compute-0 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Oct 07 21:41:07 compute-0 systemd[1]: Stopped nova_compute container.
Oct 07 21:41:07 compute-0 systemd[1]: Starting nova_compute container...
Oct 07 21:41:07 compute-0 systemd[1]: Started libcrun container.
Oct 07 21:41:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a8cb0b0796237dbe0ed5e0c62320ec1ec0129aa21971ce50da646d499c170b2a/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Oct 07 21:41:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a8cb0b0796237dbe0ed5e0c62320ec1ec0129aa21971ce50da646d499c170b2a/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct 07 21:41:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a8cb0b0796237dbe0ed5e0c62320ec1ec0129aa21971ce50da646d499c170b2a/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Oct 07 21:41:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a8cb0b0796237dbe0ed5e0c62320ec1ec0129aa21971ce50da646d499c170b2a/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct 07 21:41:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a8cb0b0796237dbe0ed5e0c62320ec1ec0129aa21971ce50da646d499c170b2a/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Oct 07 21:41:07 compute-0 podman[192701]: 2025-10-07 21:41:07.457128265 +0000 UTC m=+0.125698934 container init b4aaf72852b786c91155420dda1a94ebd5742188b177eabf48cfb0ef87874926 (image=38.102.83.12:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': '38.102.83.12:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute)
Oct 07 21:41:07 compute-0 podman[192701]: 2025-10-07 21:41:07.467398395 +0000 UTC m=+0.135968974 container start b4aaf72852b786c91155420dda1a94ebd5742188b177eabf48cfb0ef87874926 (image=38.102.83.12:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_data={'image': '38.102.83.12:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, org.label-schema.license=GPLv2)
Oct 07 21:41:07 compute-0 podman[192701]: nova_compute
Oct 07 21:41:07 compute-0 systemd[1]: Started nova_compute container.
Oct 07 21:41:07 compute-0 nova_compute[192716]: + sudo -E kolla_set_configs
Oct 07 21:41:07 compute-0 sudo[192643]: pam_unix(sudo:session): session closed for user root
Oct 07 21:41:07 compute-0 nova_compute[192716]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 07 21:41:07 compute-0 nova_compute[192716]: INFO:__main__:Validating config file
Oct 07 21:41:07 compute-0 nova_compute[192716]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 07 21:41:07 compute-0 nova_compute[192716]: INFO:__main__:Copying service configuration files
Oct 07 21:41:07 compute-0 nova_compute[192716]: INFO:__main__:Deleting /etc/nova/nova.conf
Oct 07 21:41:07 compute-0 nova_compute[192716]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Oct 07 21:41:07 compute-0 nova_compute[192716]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Oct 07 21:41:07 compute-0 nova_compute[192716]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Oct 07 21:41:07 compute-0 nova_compute[192716]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Oct 07 21:41:07 compute-0 nova_compute[192716]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Oct 07 21:41:07 compute-0 nova_compute[192716]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Oct 07 21:41:07 compute-0 nova_compute[192716]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Oct 07 21:41:07 compute-0 nova_compute[192716]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Oct 07 21:41:07 compute-0 nova_compute[192716]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Oct 07 21:41:07 compute-0 nova_compute[192716]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Oct 07 21:41:07 compute-0 nova_compute[192716]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Oct 07 21:41:07 compute-0 nova_compute[192716]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct 07 21:41:07 compute-0 nova_compute[192716]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct 07 21:41:07 compute-0 nova_compute[192716]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct 07 21:41:07 compute-0 nova_compute[192716]: INFO:__main__:Deleting /etc/ceph
Oct 07 21:41:07 compute-0 nova_compute[192716]: INFO:__main__:Creating directory /etc/ceph
Oct 07 21:41:07 compute-0 nova_compute[192716]: INFO:__main__:Setting permission for /etc/ceph
Oct 07 21:41:07 compute-0 nova_compute[192716]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Oct 07 21:41:07 compute-0 nova_compute[192716]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Oct 07 21:41:07 compute-0 nova_compute[192716]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct 07 21:41:07 compute-0 nova_compute[192716]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Oct 07 21:41:07 compute-0 nova_compute[192716]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Oct 07 21:41:07 compute-0 nova_compute[192716]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct 07 21:41:07 compute-0 nova_compute[192716]: INFO:__main__:Writing out command to execute
Oct 07 21:41:07 compute-0 nova_compute[192716]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Oct 07 21:41:07 compute-0 nova_compute[192716]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct 07 21:41:07 compute-0 nova_compute[192716]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct 07 21:41:07 compute-0 nova_compute[192716]: ++ cat /run_command
Oct 07 21:41:07 compute-0 nova_compute[192716]: + CMD=nova-compute
Oct 07 21:41:07 compute-0 nova_compute[192716]: + ARGS=
Oct 07 21:41:07 compute-0 nova_compute[192716]: + sudo kolla_copy_cacerts
Oct 07 21:41:07 compute-0 nova_compute[192716]: + [[ ! -n '' ]]
Oct 07 21:41:07 compute-0 nova_compute[192716]: + . kolla_extend_start
Oct 07 21:41:07 compute-0 nova_compute[192716]: Running command: 'nova-compute'
Oct 07 21:41:07 compute-0 nova_compute[192716]: + echo 'Running command: '\''nova-compute'\'''
Oct 07 21:41:07 compute-0 nova_compute[192716]: + umask 0022
Oct 07 21:41:07 compute-0 nova_compute[192716]: + exec nova-compute
Oct 07 21:41:08 compute-0 sudo[192877]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlkualbvkejdqlbdowwsptvcoouhtpdg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873267.8425744-3539-104818527044372/AnsiballZ_podman_container.py'
Oct 07 21:41:08 compute-0 sudo[192877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:41:08 compute-0 python3.9[192879]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Oct 07 21:41:08 compute-0 systemd[1]: Started libpod-conmon-9882b6bd3ae3bea204900ba4855035be5f4146e43065b892cff5183f0ef88d4a.scope.
Oct 07 21:41:08 compute-0 systemd[1]: Started libcrun container.
Oct 07 21:41:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f7e421264965b37ed160624bb7b2770a0126be58f83079454944e1e5b1b5e1a8/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Oct 07 21:41:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f7e421264965b37ed160624bb7b2770a0126be58f83079454944e1e5b1b5e1a8/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Oct 07 21:41:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f7e421264965b37ed160624bb7b2770a0126be58f83079454944e1e5b1b5e1a8/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Oct 07 21:41:08 compute-0 podman[192901]: 2025-10-07 21:41:08.674893406 +0000 UTC m=+0.142071711 container init 9882b6bd3ae3bea204900ba4855035be5f4146e43065b892cff5183f0ef88d4a (image=38.102.83.12:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, config_data={'image': '38.102.83.12:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=nova_compute_init, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=edpm, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 07 21:41:08 compute-0 podman[192901]: 2025-10-07 21:41:08.687636738 +0000 UTC m=+0.154815033 container start 9882b6bd3ae3bea204900ba4855035be5f4146e43065b892cff5183f0ef88d4a (image=38.102.83.12:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, org.label-schema.build-date=20251007, container_name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=edpm, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': '38.102.83.12:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']})
Oct 07 21:41:08 compute-0 python3.9[192879]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Oct 07 21:41:08 compute-0 nova_compute_init[192923]: INFO:nova_statedir:Applying nova statedir ownership
Oct 07 21:41:08 compute-0 nova_compute_init[192923]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Oct 07 21:41:08 compute-0 nova_compute_init[192923]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Oct 07 21:41:08 compute-0 nova_compute_init[192923]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Oct 07 21:41:08 compute-0 nova_compute_init[192923]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Oct 07 21:41:08 compute-0 nova_compute_init[192923]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Oct 07 21:41:08 compute-0 nova_compute_init[192923]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Oct 07 21:41:08 compute-0 nova_compute_init[192923]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Oct 07 21:41:08 compute-0 nova_compute_init[192923]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Oct 07 21:41:08 compute-0 nova_compute_init[192923]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Oct 07 21:41:08 compute-0 nova_compute_init[192923]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Oct 07 21:41:08 compute-0 nova_compute_init[192923]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Oct 07 21:41:08 compute-0 nova_compute_init[192923]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Oct 07 21:41:08 compute-0 nova_compute_init[192923]: INFO:nova_statedir:Nova statedir ownership complete
Oct 07 21:41:08 compute-0 systemd[1]: libpod-9882b6bd3ae3bea204900ba4855035be5f4146e43065b892cff5183f0ef88d4a.scope: Deactivated successfully.
Oct 07 21:41:08 compute-0 podman[192937]: 2025-10-07 21:41:08.805251155 +0000 UTC m=+0.031104467 container died 9882b6bd3ae3bea204900ba4855035be5f4146e43065b892cff5183f0ef88d4a (image=38.102.83.12:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'image': '38.102.83.12:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct 07 21:41:08 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9882b6bd3ae3bea204900ba4855035be5f4146e43065b892cff5183f0ef88d4a-userdata-shm.mount: Deactivated successfully.
Oct 07 21:41:08 compute-0 systemd[1]: var-lib-containers-storage-overlay-f7e421264965b37ed160624bb7b2770a0126be58f83079454944e1e5b1b5e1a8-merged.mount: Deactivated successfully.
Oct 07 21:41:08 compute-0 podman[192937]: 2025-10-07 21:41:08.839640328 +0000 UTC m=+0.065493620 container cleanup 9882b6bd3ae3bea204900ba4855035be5f4146e43065b892cff5183f0ef88d4a (image=38.102.83.12:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'image': '38.102.83.12:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 07 21:41:08 compute-0 systemd[1]: libpod-conmon-9882b6bd3ae3bea204900ba4855035be5f4146e43065b892cff5183f0ef88d4a.scope: Deactivated successfully.
Oct 07 21:41:08 compute-0 sudo[192877]: pam_unix(sudo:session): session closed for user root
Oct 07 21:41:09 compute-0 sshd-session[158262]: Connection closed by 192.168.122.30 port 43450
Oct 07 21:41:09 compute-0 sshd-session[158259]: pam_unix(sshd:session): session closed for user zuul
Oct 07 21:41:09 compute-0 systemd[1]: session-25.scope: Deactivated successfully.
Oct 07 21:41:09 compute-0 systemd[1]: session-25.scope: Consumed 2min 39.355s CPU time.
Oct 07 21:41:09 compute-0 systemd-logind[798]: Session 25 logged out. Waiting for processes to exit.
Oct 07 21:41:09 compute-0 systemd-logind[798]: Removed session 25.
Oct 07 21:41:09 compute-0 nova_compute[192716]: 2025-10-07 21:41:09.738 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Oct 07 21:41:09 compute-0 nova_compute[192716]: 2025-10-07 21:41:09.738 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Oct 07 21:41:09 compute-0 nova_compute[192716]: 2025-10-07 21:41:09.739 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Oct 07 21:41:09 compute-0 nova_compute[192716]: 2025-10-07 21:41:09.739 2 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Oct 07 21:41:09 compute-0 nova_compute[192716]: 2025-10-07 21:41:09.916 2 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:41:09 compute-0 nova_compute[192716]: 2025-10-07 21:41:09.950 2 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.034s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:41:09 compute-0 nova_compute[192716]: 2025-10-07 21:41:09.989 2 INFO oslo_service.periodic_task [-] Skipping periodic task _heal_instance_info_cache because its interval is negative
Oct 07 21:41:09 compute-0 nova_compute[192716]: 2025-10-07 21:41:09.991 2 WARNING oslo_config.cfg [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] Deprecated: Option "heartbeat_in_pthread" from group "oslo_messaging_rabbit" is deprecated for removal (The option is related to Eventlet which will be removed. In addition this has never worked as expected with services using eventlet for core service framework.).  Its value may be silently ignored in the future.
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.019 2 INFO nova.virt.driver [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.129 2 INFO nova.compute.provider_config [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.638 2 DEBUG oslo_concurrency.lockutils [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.638 2 DEBUG oslo_concurrency.lockutils [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.638 2 DEBUG oslo_concurrency.lockutils [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.639 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/service.py:274
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.639 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2804
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.639 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2805
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.639 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2806
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.640 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2807
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.640 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2809
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.640 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.640 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.640 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.640 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.641 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.641 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.641 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cell_worker_thread_pool_size   = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.641 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.641 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.641 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.642 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.642 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.642 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.642 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.642 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.642 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.643 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.643 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.643 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.643 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.643 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.643 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.644 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] default_green_pool_size        = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.644 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.644 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.644 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] default_thread_pool_size       = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.644 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.645 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.645 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] fatal_deprecations             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.645 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.645 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.645 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.645 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.646 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] heal_instance_info_cache_interval = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.646 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.646 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.646 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.646 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.647 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] injected_network_template      = /usr/lib/python3.12/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.647 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.647 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.647 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.647 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.648 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.648 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.648 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.648 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.648 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.648 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] key                            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.649 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.649 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] log_color                      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.649 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.649 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.649 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.649 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.649 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.650 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.650 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.650 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.650 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.650 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.650 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.650 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.651 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.651 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.651 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.651 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.651 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.651 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.651 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.652 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.652 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.652 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.652 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.652 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.652 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.652 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] my_shared_fs_storage_ip        = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.653 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.653 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.653 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.653 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.653 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.653 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.653 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.654 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.654 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.654 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] pybasedir                      = /usr/lib/python3.12/site-packages log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.654 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.654 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.654 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.654 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.654 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.655 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.655 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] record                         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.655 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.655 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.655 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.655 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.655 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.655 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.656 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.656 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.656 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.656 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.656 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.656 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.656 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.656 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.657 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.657 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.657 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.657 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.657 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.657 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] shell_completion               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.657 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.657 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.658 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.658 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.658 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.658 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.658 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.658 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.658 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.658 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.659 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] thread_pool_statistic_period   = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.659 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.659 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.659 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.659 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.659 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.659 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.659 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.659 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.660 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.660 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.660 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.660 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.660 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.660 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.660 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.660 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.661 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.661 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.661 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] os_brick.lock_path             = /var/lib/nova/tmp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.661 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.661 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.661 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.661 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.661 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.662 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.662 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.662 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.662 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.662 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.663 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.663 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.663 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.663 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.664 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.664 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.664 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.664 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.664 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.664 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] api.neutron_default_project_id = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.665 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] api.response_validation        = warn log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.665 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.665 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.665 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.665 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.665 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.665 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.666 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.666 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.666 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.666 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.666 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cache.backend_expiration_time  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.666 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.667 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.667 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.667 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.667 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.667 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.667 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cache.enforce_fips_mode        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.667 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.668 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.668 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.668 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.668 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cache.memcache_password        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.668 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.668 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.669 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.669 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.669 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.669 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.669 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.670 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cache.memcache_username        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.670 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.670 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cache.redis_db                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.670 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cache.redis_password           = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.670 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cache.redis_sentinel_service_name = mymaster log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.670 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cache.redis_sentinels          = ['localhost:26379'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.670 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cache.redis_server             = localhost:6379 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.671 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cache.redis_socket_timeout     = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.671 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cache.redis_username           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.671 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.671 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.671 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.671 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.671 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.672 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.672 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.672 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.672 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.672 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.672 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.673 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.673 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.673 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.673 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.673 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.673 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.673 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.674 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.674 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.674 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.674 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.674 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.674 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.674 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.675 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.675 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.675 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.675 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.675 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.675 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.676 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.676 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.676 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.676 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.676 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] compute.sharing_providers_max_uuids_per_request = 200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.676 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.676 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.677 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.678 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.678 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.679 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.679 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] consoleauth.enforce_session_timeout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.679 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.679 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.679 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.679 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.679 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.680 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.680 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.680 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.680 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.680 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.680 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.681 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.681 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cyborg.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.681 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.681 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.681 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.681 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.681 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.682 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.682 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.682 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.682 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] database.asyncio_connection    = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.682 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.682 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.682 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.682 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.683 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.683 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.683 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.683 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.683 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.683 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.683 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.683 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.684 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.684 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.684 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.684 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.684 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.684 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.684 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.685 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.685 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] api_database.asyncio_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.685 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] api_database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.685 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.685 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.685 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.685 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.686 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.686 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.686 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.686 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.686 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.686 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.687 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.687 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.687 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.687 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.687 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.687 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.687 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.687 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.688 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.688 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.688 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.688 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] ephemeral_storage_encryption.default_format = luks log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.688 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.688 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.688 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.689 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.689 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.689 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.689 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.689 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.689 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.690 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.690 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.690 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.690 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.690 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.690 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.691 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.691 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.691 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.691 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.691 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.691 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.691 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.692 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.692 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] glance.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.692 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.692 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.692 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.692 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.692 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.693 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.693 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.693 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.693 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.693 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.693 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] manila.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.694 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] manila.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.694 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] manila.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.694 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] manila.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.694 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] manila.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.694 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] manila.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.694 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] manila.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.694 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] manila.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.694 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] manila.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.695 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] manila.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.695 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] manila.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.695 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] manila.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.695 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] manila.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.695 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] manila.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.695 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] manila.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.696 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] manila.service_type            = shared-file-system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.696 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] manila.share_apply_policy_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.696 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] manila.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.696 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] manila.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.696 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] manila.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.696 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] manila.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.696 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] manila.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.696 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] manila.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.697 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.697 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.697 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.697 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.697 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.698 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.698 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.698 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.698 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.698 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.698 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.699 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.699 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.699 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.699 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.699 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] ironic.conductor_group         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.699 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.699 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.699 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.700 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.700 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.700 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.700 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.700 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.700 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.700 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] ironic.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.701 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.701 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.701 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.701 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] ironic.shard                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.701 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.701 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.702 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.702 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.702 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.702 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.702 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.702 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.703 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.703 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.703 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.703 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.703 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.703 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.703 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.703 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.704 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.704 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.704 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.704 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.704 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.704 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.704 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.705 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.705 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.705 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.705 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.705 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.705 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.705 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.705 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.706 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.706 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.706 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.706 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vault.approle_role_id          = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.706 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vault.approle_secret_id        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.706 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.706 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vault.kv_path                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.706 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.707 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.707 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vault.root_token_id            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.707 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.707 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vault.timeout                  = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.707 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.707 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.707 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.707 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.708 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.708 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.708 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.708 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.708 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.708 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.708 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.708 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.709 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.709 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] keystone.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.709 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.709 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.709 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.709 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.710 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.710 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.710 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.710 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.710 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.ceph_mount_options     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.710 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.ceph_mount_point_base  = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.710 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.711 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.711 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.711 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.711 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.711 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.711 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.711 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.712 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.712 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.712 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.712 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.712 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.712 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.713 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.713 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.713 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.713 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.713 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.713 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.713 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.713 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.714 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.714 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.714 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.714 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.714 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.714 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.714 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.715 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.715 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.715 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.715 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.715 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.715 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.715 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.716 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.716 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.716 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.716 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.716 2 WARNING oslo_config.cfg [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Oct 07 21:41:11 compute-0 nova_compute[192716]: live_migration_uri is deprecated for removal in favor of two other options that
Oct 07 21:41:11 compute-0 nova_compute[192716]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Oct 07 21:41:11 compute-0 nova_compute[192716]: and ``live_migration_inbound_addr`` respectively.
Oct 07 21:41:11 compute-0 nova_compute[192716]: ).  Its value may be silently ignored in the future.
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.717 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.717 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.717 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.717 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.718 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.migration_inbound_addr = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.718 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.719 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.719 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.719 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.719 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.720 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.720 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.720 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.721 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.721 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.721 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.722 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.722 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.722 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.722 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.722 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.722 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.722 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.722 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.722 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.723 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.723 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.723 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.723 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.723 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.723 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.723 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.724 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.724 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.724 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.724 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.724 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.724 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.724 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.tb_cache_size          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.724 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.724 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.725 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.725 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.725 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.725 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.725 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.volume_enforce_multipath = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.725 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.725 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.725 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.726 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.726 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.726 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.726 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.726 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.726 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.726 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.727 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.727 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.727 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.727 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.727 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.727 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.727 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.727 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.728 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.728 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.728 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.728 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.728 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.728 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.728 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.728 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.728 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.729 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.729 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] neutron.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.729 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.729 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.729 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.729 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.729 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.729 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.730 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.730 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.730 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.730 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.730 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.730 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] notifications.include_share_mapping = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.730 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.730 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.731 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.731 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.731 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.731 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.731 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.731 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.731 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.731 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.731 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.732 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.732 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.732 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.732 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.732 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.732 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.732 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.732 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.732 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.733 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.733 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.733 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.733 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.733 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.733 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.733 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.733 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.734 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.734 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] placement.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.734 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.734 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.734 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.734 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.734 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.734 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.734 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.735 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.735 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.735 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.735 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.735 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.735 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.735 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.735 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.736 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.736 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.736 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.736 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.736 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.736 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.736 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.736 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.737 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.737 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.737 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.737 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.737 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] quota.unified_limits_resource_list = ['servers'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.737 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] quota.unified_limits_resource_strategy = require log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.737 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.738 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.738 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.738 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.738 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.738 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.738 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.738 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.739 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.739 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.739 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.739 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.739 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.739 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.740 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.740 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.740 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.740 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.740 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.740 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] filter_scheduler.hypervisor_version_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.740 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.740 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] filter_scheduler.image_props_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.741 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] filter_scheduler.image_props_weight_setting = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.741 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.741 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.741 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.741 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.741 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.741 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] filter_scheduler.num_instances_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.741 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.742 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.742 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.742 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.742 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.742 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.742 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.743 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.743 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.743 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.743 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.743 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.743 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.743 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.743 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.744 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.744 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.744 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.744 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.744 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.744 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.745 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.745 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.745 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.745 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.745 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.745 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.745 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.745 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.746 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.746 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.746 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.746 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.746 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.747 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.747 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.747 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.747 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] spice.require_secure           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.747 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.747 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.748 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] spice.spice_direct_proxy_base_url = http://127.0.0.1:13002/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.748 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.748 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.748 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.748 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.748 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.748 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.748 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.748 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.749 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.749 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.749 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.749 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.749 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.749 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.750 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.750 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.750 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.750 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.750 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.750 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.750 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.750 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.751 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.751 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.751 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.751 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.751 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.751 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.751 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.752 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.752 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.752 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.752 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.752 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.752 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.752 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.752 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.753 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.753 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.753 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.753 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.753 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.753 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.753 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.754 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.754 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.754 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.754 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.754 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.754 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.755 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.755 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.755 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.755 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.755 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.755 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.756 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.756 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.756 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.756 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.756 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.756 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.756 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.756 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.757 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.757 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.757 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.757 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.757 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.757 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.758 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.758 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.758 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.758 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.758 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.758 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.758 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.759 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.759 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.759 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.759 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.759 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.759 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.759 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.760 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.760 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.760 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.760 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.760 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_messaging_rabbit.hostname = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.760 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.761 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.761 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.761 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.761 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_splay = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.761 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_messaging_rabbit.processname = nova-compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.761 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.761 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.762 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.762 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.762 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.762 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.762 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.762 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.762 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.763 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.763 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_messaging_rabbit.rabbit_stream_fanout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.763 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.763 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_messaging_rabbit.rabbit_transient_quorum_queue = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.763 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.763 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.764 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.764 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.764 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.764 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.764 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.764 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_messaging_rabbit.use_queue_manager = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.765 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.765 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.765 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.765 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.765 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.765 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.766 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.766 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.766 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.766 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.766 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.766 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.766 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.767 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.767 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.767 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.767 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.767 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_limit.endpoint_interface  = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.767 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.767 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_limit.endpoint_region_name = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.767 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_limit.endpoint_service_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.768 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_limit.endpoint_service_type = compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.768 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.768 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.768 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.768 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.768 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.768 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.768 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.768 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.769 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.769 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.769 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_limit.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.769 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.769 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.769 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.769 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.770 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.770 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.770 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.770 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.770 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.770 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.770 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.771 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.771 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.771 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.771 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.771 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.771 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.772 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.772 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.772 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.772 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vif_plug_linux_bridge_privileged.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.772 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.772 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.772 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.772 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.773 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.773 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.773 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vif_plug_ovs_privileged.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.773 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.773 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.773 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.773 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.774 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.774 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.774 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.774 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.774 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.774 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.774 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.775 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] os_vif_ovs.default_qos_type    = linux-noop log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.775 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.775 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.775 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.775 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.775 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.775 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.776 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] privsep_osbrick.capabilities   = [21, 2] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.776 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.776 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.776 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] privsep_osbrick.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.776 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.776 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.776 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.776 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.777 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.777 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.777 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] nova_sys_admin.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.777 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.777 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.777 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.777 2 DEBUG oslo_service.backend._eventlet.service [None req-339bfcf3-375a-490f-8d8a-7486a6476745 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2828
Oct 07 21:41:11 compute-0 nova_compute[192716]: 2025-10-07 21:41:11.778 2 INFO nova.service [-] Starting compute node (version 32.1.0-0.20251007122402.7278e66.el10)
Oct 07 21:41:12 compute-0 nova_compute[192716]: 2025-10-07 21:41:12.286 2 DEBUG nova.virt.libvirt.host [None req-03a9d3b0-079e-4ca8-a359-3341fb98af42 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:498
Oct 07 21:41:12 compute-0 nova_compute[192716]: 2025-10-07 21:41:12.300 2 DEBUG nova.virt.libvirt.host [None req-03a9d3b0-079e-4ca8-a359-3341fb98af42 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fa652a6d790> _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:504
Oct 07 21:41:12 compute-0 nova_compute[192716]: libvirt:  error : internal error: could not initialize domain event timer
Oct 07 21:41:12 compute-0 nova_compute[192716]: 2025-10-07 21:41:12.301 2 WARNING nova.virt.libvirt.host [None req-03a9d3b0-079e-4ca8-a359-3341fb98af42 - - - - - -] URI qemu:///system does not support events: internal error: could not initialize domain event timer: libvirt.libvirtError: internal error: could not initialize domain event timer
Oct 07 21:41:12 compute-0 nova_compute[192716]: 2025-10-07 21:41:12.302 2 DEBUG nova.virt.libvirt.host [None req-03a9d3b0-079e-4ca8-a359-3341fb98af42 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fa652a6d790> _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:525
Oct 07 21:41:12 compute-0 nova_compute[192716]: 2025-10-07 21:41:12.304 2 DEBUG nova.virt.libvirt.host [None req-03a9d3b0-079e-4ca8-a359-3341fb98af42 - - - - - -] Starting native event thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:484
Oct 07 21:41:12 compute-0 nova_compute[192716]: 2025-10-07 21:41:12.305 2 DEBUG nova.virt.libvirt.host [None req-03a9d3b0-079e-4ca8-a359-3341fb98af42 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:490
Oct 07 21:41:12 compute-0 nova_compute[192716]: 2025-10-07 21:41:12.306 2 INFO nova.utils [None req-03a9d3b0-079e-4ca8-a359-3341fb98af42 - - - - - -] The default thread pool MainProcess.default is initialized
Oct 07 21:41:12 compute-0 nova_compute[192716]: 2025-10-07 21:41:12.307 2 DEBUG nova.virt.libvirt.host [None req-03a9d3b0-079e-4ca8-a359-3341fb98af42 - - - - - -] Starting connection event dispatch thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:493
Oct 07 21:41:12 compute-0 nova_compute[192716]: 2025-10-07 21:41:12.308 2 INFO nova.virt.libvirt.driver [None req-03a9d3b0-079e-4ca8-a359-3341fb98af42 - - - - - -] Connection event '1' reason 'None'
Oct 07 21:41:12 compute-0 nova_compute[192716]: 2025-10-07 21:41:12.316 2 INFO nova.virt.libvirt.host [None req-03a9d3b0-079e-4ca8-a359-3341fb98af42 - - - - - -] Libvirt host capabilities <capabilities>
Oct 07 21:41:12 compute-0 nova_compute[192716]: 
Oct 07 21:41:12 compute-0 nova_compute[192716]:   <host>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <uuid>bec7e4d5-9195-425b-8d09-52635eb49950</uuid>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <cpu>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <arch>x86_64</arch>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model>EPYC-Rome-v4</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <vendor>AMD</vendor>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <microcode version='16777317'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <signature family='23' model='49' stepping='0'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <maxphysaddr mode='emulate' bits='40'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature name='x2apic'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature name='tsc-deadline'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature name='osxsave'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature name='hypervisor'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature name='tsc_adjust'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature name='spec-ctrl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature name='stibp'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature name='arch-capabilities'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature name='ssbd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature name='cmp_legacy'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature name='topoext'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature name='virt-ssbd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature name='lbrv'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature name='tsc-scale'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature name='vmcb-clean'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature name='pause-filter'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature name='pfthreshold'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature name='svme-addr-chk'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature name='rdctl-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature name='skip-l1dfl-vmentry'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature name='mds-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature name='pschange-mc-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <pages unit='KiB' size='4'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <pages unit='KiB' size='2048'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <pages unit='KiB' size='1048576'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </cpu>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <power_management>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <suspend_mem/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <suspend_disk/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <suspend_hybrid/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </power_management>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <iommu support='no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <migration_features>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <live/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <uri_transports>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <uri_transport>tcp</uri_transport>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <uri_transport>rdma</uri_transport>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </uri_transports>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </migration_features>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <topology>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <cells num='1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <cell id='0'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:           <memory unit='KiB'>7864100</memory>
Oct 07 21:41:12 compute-0 nova_compute[192716]:           <pages unit='KiB' size='4'>1966025</pages>
Oct 07 21:41:12 compute-0 nova_compute[192716]:           <pages unit='KiB' size='2048'>0</pages>
Oct 07 21:41:12 compute-0 nova_compute[192716]:           <pages unit='KiB' size='1048576'>0</pages>
Oct 07 21:41:12 compute-0 nova_compute[192716]:           <distances>
Oct 07 21:41:12 compute-0 nova_compute[192716]:             <sibling id='0' value='10'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:           </distances>
Oct 07 21:41:12 compute-0 nova_compute[192716]:           <cpus num='8'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:           </cpus>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         </cell>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </cells>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </topology>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <cache>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </cache>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <secmodel>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model>selinux</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <doi>0</doi>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </secmodel>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <secmodel>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model>dac</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <doi>0</doi>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <baselabel type='kvm'>+107:+107</baselabel>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <baselabel type='qemu'>+107:+107</baselabel>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </secmodel>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   </host>
Oct 07 21:41:12 compute-0 nova_compute[192716]: 
Oct 07 21:41:12 compute-0 nova_compute[192716]:   <guest>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <os_type>hvm</os_type>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <arch name='i686'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <wordsize>32</wordsize>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <domain type='qemu'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <domain type='kvm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </arch>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <features>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <pae/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <nonpae/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <acpi default='on' toggle='yes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <apic default='on' toggle='no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <cpuselection/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <deviceboot/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <disksnapshot default='on' toggle='no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <externalSnapshot/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </features>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   </guest>
Oct 07 21:41:12 compute-0 nova_compute[192716]: 
Oct 07 21:41:12 compute-0 nova_compute[192716]:   <guest>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <os_type>hvm</os_type>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <arch name='x86_64'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <wordsize>64</wordsize>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <domain type='qemu'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <domain type='kvm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </arch>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <features>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <acpi default='on' toggle='yes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <apic default='on' toggle='no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <cpuselection/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <deviceboot/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <disksnapshot default='on' toggle='no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <externalSnapshot/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </features>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   </guest>
Oct 07 21:41:12 compute-0 nova_compute[192716]: 
Oct 07 21:41:12 compute-0 nova_compute[192716]: </capabilities>
Oct 07 21:41:12 compute-0 nova_compute[192716]: 
Oct 07 21:41:12 compute-0 nova_compute[192716]: 2025-10-07 21:41:12.321 2 DEBUG nova.virt.libvirt.host [None req-03a9d3b0-079e-4ca8-a359-3341fb98af42 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:944
Oct 07 21:41:12 compute-0 nova_compute[192716]: 2025-10-07 21:41:12.341 2 DEBUG nova.virt.libvirt.host [None req-03a9d3b0-079e-4ca8-a359-3341fb98af42 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Oct 07 21:41:12 compute-0 nova_compute[192716]: <domainCapabilities>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   <path>/usr/libexec/qemu-kvm</path>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   <domain>kvm</domain>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   <machine>pc-i440fx-rhel7.6.0</machine>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   <arch>i686</arch>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   <vcpu max='240'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   <iothreads supported='yes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   <os supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <enum name='firmware'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <loader supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='type'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>rom</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>pflash</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='readonly'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>yes</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>no</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='secure'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>no</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </loader>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   </os>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   <cpu>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <mode name='host-passthrough' supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='hostPassthroughMigratable'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>on</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>off</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </mode>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <mode name='maximum' supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='maximumMigratable'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>on</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>off</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </mode>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <mode name='host-model' supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model fallback='forbid'>EPYC-Rome</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <vendor>AMD</vendor>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <maxphysaddr mode='passthrough' limit='40'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='x2apic'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='tsc-deadline'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='hypervisor'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='tsc_adjust'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='spec-ctrl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='stibp'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='arch-capabilities'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='ssbd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='cmp_legacy'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='overflow-recov'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='succor'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='ibrs'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='amd-ssbd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='virt-ssbd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='lbrv'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='tsc-scale'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='vmcb-clean'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='flushbyasid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='pause-filter'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='pfthreshold'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='svme-addr-chk'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='lfence-always-serializing'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='rdctl-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='mds-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='pschange-mc-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='gds-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='rfds-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='disable' name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </mode>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <mode name='custom' supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Broadwell'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Broadwell-IBRS'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Broadwell-noTSX'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Broadwell-noTSX-IBRS'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Broadwell-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Broadwell-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Broadwell-v3'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Broadwell-v4'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Cascadelake-Server'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Cascadelake-Server-noTSX'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Cascadelake-Server-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Cascadelake-Server-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Cascadelake-Server-v3'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Cascadelake-Server-v4'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Cascadelake-Server-v5'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Cooperlake'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Cooperlake-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Cooperlake-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Denverton'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='mpx'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Denverton-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='mpx'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Denverton-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Denverton-v3'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Dhyana-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='EPYC-Genoa'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amd-psfd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='auto-ibrs'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='no-nested-data-bp'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='null-sel-clr-base'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='stibp-always-on'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='EPYC-Genoa-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amd-psfd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='auto-ibrs'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='no-nested-data-bp'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='null-sel-clr-base'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='stibp-always-on'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='EPYC-Milan'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='EPYC-Milan-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='EPYC-Milan-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amd-psfd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='no-nested-data-bp'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='null-sel-clr-base'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='stibp-always-on'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='EPYC-Rome'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='EPYC-Rome-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='EPYC-Rome-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='EPYC-Rome-v3'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='EPYC-v3'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='EPYC-v4'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='GraniteRapids'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-fp16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-int8'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-tile'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-fp16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='bus-lock-detect'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fbsdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrc'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrs'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fzrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='mcdt-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pbrsb-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='prefetchiti'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='psdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='sbdr-ssdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='serialize'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='tsx-ldtrk'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xfd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='GraniteRapids-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-fp16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-int8'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-tile'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-fp16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='bus-lock-detect'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fbsdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrc'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrs'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fzrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='mcdt-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pbrsb-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='prefetchiti'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='psdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='sbdr-ssdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='serialize'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='tsx-ldtrk'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xfd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='GraniteRapids-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-fp16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-int8'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-tile'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx10'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx10-128'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx10-256'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx10-512'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-fp16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='bus-lock-detect'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='cldemote'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fbsdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrc'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrs'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fzrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='mcdt-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='movdir64b'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='movdiri'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pbrsb-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='prefetchiti'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='psdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='sbdr-ssdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='serialize'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ss'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='tsx-ldtrk'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xfd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Haswell'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Haswell-IBRS'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Haswell-noTSX'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Haswell-noTSX-IBRS'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Haswell-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Haswell-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Haswell-v3'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Haswell-v4'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Icelake-Server'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Icelake-Server-noTSX'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Icelake-Server-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Icelake-Server-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Icelake-Server-v3'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Icelake-Server-v4'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Icelake-Server-v5'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Icelake-Server-v6'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Icelake-Server-v7'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='IvyBridge'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='IvyBridge-IBRS'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='IvyBridge-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='IvyBridge-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='KnightsMill'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-4fmaps'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-4vnniw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512er'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512pf'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ss'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='KnightsMill-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-4fmaps'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-4vnniw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512er'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512pf'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ss'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Opteron_G4'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fma4'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xop'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Opteron_G4-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fma4'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xop'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Opteron_G5'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fma4'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='tbm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xop'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Opteron_G5-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fma4'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='tbm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xop'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='SapphireRapids'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-int8'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-tile'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-fp16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='bus-lock-detect'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrc'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrs'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fzrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='serialize'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='tsx-ldtrk'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xfd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='SapphireRapids-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-int8'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-tile'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-fp16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='bus-lock-detect'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrc'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrs'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fzrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='serialize'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='tsx-ldtrk'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xfd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='SapphireRapids-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-int8'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-tile'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-fp16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='bus-lock-detect'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fbsdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrc'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrs'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fzrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='psdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='sbdr-ssdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='serialize'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='tsx-ldtrk'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xfd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='SapphireRapids-v3'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-int8'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-tile'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-fp16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='bus-lock-detect'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='cldemote'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fbsdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrc'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrs'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fzrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='movdir64b'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='movdiri'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='psdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='sbdr-ssdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='serialize'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ss'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='tsx-ldtrk'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xfd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='SierraForest'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-ne-convert'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-vnni-int8'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='bus-lock-detect'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='cmpccxadd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fbsdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrs'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='mcdt-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pbrsb-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='psdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='sbdr-ssdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='serialize'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='SierraForest-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-ne-convert'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-vnni-int8'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='bus-lock-detect'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='cmpccxadd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fbsdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrs'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='mcdt-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pbrsb-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='psdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='sbdr-ssdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='serialize'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Client'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Client-IBRS'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Client-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Client-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Client-v3'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Client-v4'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Server'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Server-IBRS'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Server-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Server-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Server-v3'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Server-v4'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Server-v5'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Snowridge'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='cldemote'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='core-capability'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='movdir64b'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='movdiri'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='mpx'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='split-lock-detect'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Snowridge-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='cldemote'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='core-capability'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='movdir64b'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='movdiri'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='mpx'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='split-lock-detect'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Snowridge-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='cldemote'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='core-capability'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='movdir64b'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='movdiri'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='split-lock-detect'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Snowridge-v3'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='cldemote'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='core-capability'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='movdir64b'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='movdiri'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='split-lock-detect'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Snowridge-v4'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='cldemote'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='movdir64b'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='movdiri'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='athlon'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='3dnow'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='3dnowext'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='athlon-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='3dnow'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='3dnowext'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='core2duo'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ss'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='core2duo-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ss'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='coreduo'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ss'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='coreduo-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ss'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='n270'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ss'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='n270-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ss'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='phenom'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='3dnow'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='3dnowext'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='phenom-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='3dnow'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='3dnowext'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </mode>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   </cpu>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   <memoryBacking supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <enum name='sourceType'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <value>file</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <value>anonymous</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <value>memfd</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   </memoryBacking>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   <devices>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <disk supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='diskDevice'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>disk</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>cdrom</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>floppy</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>lun</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='bus'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>ide</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>fdc</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>scsi</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>virtio</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>usb</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>sata</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='model'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>virtio</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>virtio-transitional</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>virtio-non-transitional</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </disk>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <graphics supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='type'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>vnc</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>egl-headless</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>dbus</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </graphics>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <video supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='modelType'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>vga</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>cirrus</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>virtio</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>none</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>bochs</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>ramfb</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </video>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <hostdev supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='mode'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>subsystem</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='startupPolicy'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>default</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>mandatory</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>requisite</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>optional</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='subsysType'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>usb</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>pci</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>scsi</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='capsType'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='pciBackend'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </hostdev>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <rng supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='model'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>virtio</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>virtio-transitional</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>virtio-non-transitional</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='backendModel'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>random</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>egd</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>builtin</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </rng>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <filesystem supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='driverType'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>path</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>handle</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>virtiofs</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </filesystem>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <tpm supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='model'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>tpm-tis</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>tpm-crb</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='backendModel'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>emulator</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>external</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='backendVersion'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>2.0</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </tpm>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <redirdev supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='bus'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>usb</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </redirdev>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <channel supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='type'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>pty</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>unix</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </channel>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <crypto supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='model'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='type'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>qemu</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='backendModel'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>builtin</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </crypto>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <interface supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='backendType'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>default</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>passt</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </interface>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <panic supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='model'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>isa</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>hyperv</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </panic>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   </devices>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   <features>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <gic supported='no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <vmcoreinfo supported='yes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <genid supported='yes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <backingStoreInput supported='yes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <backup supported='yes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <async-teardown supported='yes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <ps2 supported='yes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <sev supported='no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <sgx supported='no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <hyperv supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='features'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>relaxed</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>vapic</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>spinlocks</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>vpindex</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>runtime</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>synic</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>stimer</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>reset</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>vendor_id</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>frequencies</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>reenlightenment</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>tlbflush</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>ipi</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>avic</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>emsr_bitmap</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>xmm_input</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </hyperv>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <launchSecurity supported='no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   </features>
Oct 07 21:41:12 compute-0 nova_compute[192716]: </domainCapabilities>
Oct 07 21:41:12 compute-0 nova_compute[192716]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Oct 07 21:41:12 compute-0 nova_compute[192716]: 2025-10-07 21:41:12.348 2 DEBUG nova.virt.libvirt.host [None req-03a9d3b0-079e-4ca8-a359-3341fb98af42 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Oct 07 21:41:12 compute-0 nova_compute[192716]: <domainCapabilities>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   <path>/usr/libexec/qemu-kvm</path>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   <domain>kvm</domain>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   <machine>pc-q35-rhel9.6.0</machine>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   <arch>i686</arch>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   <vcpu max='4096'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   <iothreads supported='yes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   <os supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <enum name='firmware'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <loader supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='type'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>rom</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>pflash</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='readonly'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>yes</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>no</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='secure'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>no</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </loader>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   </os>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   <cpu>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <mode name='host-passthrough' supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='hostPassthroughMigratable'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>on</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>off</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </mode>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <mode name='maximum' supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='maximumMigratable'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>on</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>off</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </mode>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <mode name='host-model' supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model fallback='forbid'>EPYC-Rome</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <vendor>AMD</vendor>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <maxphysaddr mode='passthrough' limit='40'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='x2apic'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='tsc-deadline'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='hypervisor'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='tsc_adjust'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='spec-ctrl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='stibp'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='arch-capabilities'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='ssbd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='cmp_legacy'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='overflow-recov'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='succor'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='ibrs'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='amd-ssbd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='virt-ssbd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='lbrv'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='tsc-scale'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='vmcb-clean'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='flushbyasid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='pause-filter'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='pfthreshold'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='svme-addr-chk'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='lfence-always-serializing'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='rdctl-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='mds-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='pschange-mc-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='gds-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='rfds-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='disable' name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </mode>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <mode name='custom' supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Broadwell'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Broadwell-IBRS'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Broadwell-noTSX'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Broadwell-noTSX-IBRS'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Broadwell-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Broadwell-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Broadwell-v3'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Broadwell-v4'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Cascadelake-Server'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Cascadelake-Server-noTSX'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Cascadelake-Server-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Cascadelake-Server-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Cascadelake-Server-v3'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Cascadelake-Server-v4'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Cascadelake-Server-v5'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Cooperlake'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Cooperlake-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Cooperlake-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Denverton'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='mpx'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Denverton-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='mpx'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Denverton-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Denverton-v3'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Dhyana-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='EPYC-Genoa'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amd-psfd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='auto-ibrs'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='no-nested-data-bp'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='null-sel-clr-base'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='stibp-always-on'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='EPYC-Genoa-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amd-psfd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='auto-ibrs'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='no-nested-data-bp'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='null-sel-clr-base'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='stibp-always-on'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='EPYC-Milan'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='EPYC-Milan-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='EPYC-Milan-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amd-psfd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='no-nested-data-bp'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='null-sel-clr-base'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='stibp-always-on'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='EPYC-Rome'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='EPYC-Rome-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='EPYC-Rome-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='EPYC-Rome-v3'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='EPYC-v3'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='EPYC-v4'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='GraniteRapids'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-fp16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-int8'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-tile'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-fp16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='bus-lock-detect'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fbsdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrc'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrs'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fzrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='mcdt-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pbrsb-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='prefetchiti'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='psdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='sbdr-ssdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='serialize'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='tsx-ldtrk'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xfd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='GraniteRapids-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-fp16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-int8'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-tile'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-fp16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='bus-lock-detect'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fbsdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrc'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrs'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fzrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='mcdt-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pbrsb-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='prefetchiti'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='psdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='sbdr-ssdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='serialize'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='tsx-ldtrk'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xfd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='GraniteRapids-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-fp16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-int8'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-tile'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx10'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx10-128'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx10-256'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx10-512'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-fp16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='bus-lock-detect'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='cldemote'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fbsdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrc'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrs'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fzrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='mcdt-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='movdir64b'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='movdiri'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pbrsb-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='prefetchiti'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='psdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='sbdr-ssdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='serialize'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ss'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='tsx-ldtrk'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xfd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Haswell'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Haswell-IBRS'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Haswell-noTSX'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Haswell-noTSX-IBRS'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Haswell-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Haswell-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Haswell-v3'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Haswell-v4'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Icelake-Server'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Icelake-Server-noTSX'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Icelake-Server-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Icelake-Server-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Icelake-Server-v3'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Icelake-Server-v4'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Icelake-Server-v5'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Icelake-Server-v6'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Icelake-Server-v7'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='IvyBridge'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='IvyBridge-IBRS'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='IvyBridge-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='IvyBridge-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='KnightsMill'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-4fmaps'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-4vnniw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512er'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512pf'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ss'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='KnightsMill-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-4fmaps'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-4vnniw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512er'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512pf'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ss'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Opteron_G4'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fma4'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xop'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Opteron_G4-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fma4'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xop'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Opteron_G5'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fma4'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='tbm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xop'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Opteron_G5-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fma4'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='tbm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xop'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='SapphireRapids'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-int8'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-tile'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-fp16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='bus-lock-detect'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrc'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrs'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fzrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='serialize'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='tsx-ldtrk'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xfd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='SapphireRapids-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-int8'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-tile'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-fp16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='bus-lock-detect'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrc'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrs'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fzrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='serialize'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='tsx-ldtrk'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xfd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='SapphireRapids-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-int8'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-tile'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-fp16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='bus-lock-detect'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fbsdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrc'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrs'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fzrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='psdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='sbdr-ssdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='serialize'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='tsx-ldtrk'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xfd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='SapphireRapids-v3'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-int8'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-tile'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-fp16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='bus-lock-detect'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='cldemote'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fbsdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrc'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrs'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fzrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='movdir64b'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='movdiri'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='psdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='sbdr-ssdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='serialize'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ss'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='tsx-ldtrk'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xfd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='SierraForest'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-ne-convert'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-vnni-int8'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='bus-lock-detect'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='cmpccxadd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fbsdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrs'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='mcdt-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pbrsb-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='psdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='sbdr-ssdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='serialize'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='SierraForest-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-ne-convert'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-vnni-int8'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='bus-lock-detect'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='cmpccxadd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fbsdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrs'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='mcdt-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pbrsb-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='psdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='sbdr-ssdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='serialize'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Client'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Client-IBRS'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Client-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Client-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Client-v3'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Client-v4'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Server'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Server-IBRS'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Server-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Server-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Server-v3'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Server-v4'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Server-v5'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Snowridge'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='cldemote'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='core-capability'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='movdir64b'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='movdiri'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='mpx'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='split-lock-detect'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Snowridge-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='cldemote'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='core-capability'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='movdir64b'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='movdiri'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='mpx'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='split-lock-detect'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Snowridge-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='cldemote'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='core-capability'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='movdir64b'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='movdiri'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='split-lock-detect'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Snowridge-v3'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='cldemote'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='core-capability'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='movdir64b'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='movdiri'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='split-lock-detect'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Snowridge-v4'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='cldemote'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='movdir64b'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='movdiri'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='athlon'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='3dnow'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='3dnowext'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='athlon-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='3dnow'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='3dnowext'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='core2duo'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ss'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='core2duo-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ss'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='coreduo'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ss'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='coreduo-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ss'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='n270'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ss'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='n270-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ss'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='phenom'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='3dnow'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='3dnowext'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='phenom-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='3dnow'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='3dnowext'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </mode>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   </cpu>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   <memoryBacking supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <enum name='sourceType'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <value>file</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <value>anonymous</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <value>memfd</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   </memoryBacking>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   <devices>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <disk supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='diskDevice'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>disk</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>cdrom</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>floppy</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>lun</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='bus'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>fdc</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>scsi</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>virtio</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>usb</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>sata</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='model'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>virtio</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>virtio-transitional</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>virtio-non-transitional</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </disk>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <graphics supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='type'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>vnc</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>egl-headless</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>dbus</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </graphics>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <video supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='modelType'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>vga</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>cirrus</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>virtio</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>none</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>bochs</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>ramfb</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </video>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <hostdev supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='mode'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>subsystem</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='startupPolicy'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>default</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>mandatory</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>requisite</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>optional</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='subsysType'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>usb</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>pci</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>scsi</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='capsType'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='pciBackend'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </hostdev>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <rng supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='model'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>virtio</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>virtio-transitional</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>virtio-non-transitional</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='backendModel'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>random</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>egd</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>builtin</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </rng>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <filesystem supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='driverType'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>path</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>handle</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>virtiofs</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </filesystem>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <tpm supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='model'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>tpm-tis</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>tpm-crb</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='backendModel'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>emulator</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>external</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='backendVersion'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>2.0</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </tpm>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <redirdev supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='bus'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>usb</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </redirdev>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <channel supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='type'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>pty</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>unix</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </channel>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <crypto supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='model'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='type'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>qemu</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='backendModel'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>builtin</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </crypto>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <interface supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='backendType'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>default</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>passt</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </interface>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <panic supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='model'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>isa</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>hyperv</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </panic>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   </devices>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   <features>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <gic supported='no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <vmcoreinfo supported='yes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <genid supported='yes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <backingStoreInput supported='yes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <backup supported='yes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <async-teardown supported='yes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <ps2 supported='yes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <sev supported='no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <sgx supported='no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <hyperv supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='features'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>relaxed</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>vapic</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>spinlocks</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>vpindex</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>runtime</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>synic</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>stimer</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>reset</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>vendor_id</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>frequencies</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>reenlightenment</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>tlbflush</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>ipi</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>avic</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>emsr_bitmap</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>xmm_input</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </hyperv>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <launchSecurity supported='no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   </features>
Oct 07 21:41:12 compute-0 nova_compute[192716]: </domainCapabilities>
Oct 07 21:41:12 compute-0 nova_compute[192716]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Oct 07 21:41:12 compute-0 nova_compute[192716]: 2025-10-07 21:41:12.381 2 DEBUG nova.virt.libvirt.host [None req-03a9d3b0-079e-4ca8-a359-3341fb98af42 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:944
Oct 07 21:41:12 compute-0 nova_compute[192716]: 2025-10-07 21:41:12.388 2 DEBUG nova.virt.libvirt.host [None req-03a9d3b0-079e-4ca8-a359-3341fb98af42 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Oct 07 21:41:12 compute-0 nova_compute[192716]: <domainCapabilities>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   <path>/usr/libexec/qemu-kvm</path>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   <domain>kvm</domain>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   <machine>pc-i440fx-rhel7.6.0</machine>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   <arch>x86_64</arch>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   <vcpu max='240'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   <iothreads supported='yes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   <os supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <enum name='firmware'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <loader supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='type'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>rom</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>pflash</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='readonly'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>yes</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>no</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='secure'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>no</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </loader>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   </os>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   <cpu>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <mode name='host-passthrough' supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='hostPassthroughMigratable'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>on</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>off</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </mode>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <mode name='maximum' supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='maximumMigratable'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>on</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>off</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </mode>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <mode name='host-model' supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model fallback='forbid'>EPYC-Rome</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <vendor>AMD</vendor>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <maxphysaddr mode='passthrough' limit='40'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='x2apic'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='tsc-deadline'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='hypervisor'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='tsc_adjust'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='spec-ctrl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='stibp'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='arch-capabilities'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='ssbd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='cmp_legacy'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='overflow-recov'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='succor'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='ibrs'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='amd-ssbd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='virt-ssbd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='lbrv'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='tsc-scale'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='vmcb-clean'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='flushbyasid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='pause-filter'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='pfthreshold'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='svme-addr-chk'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='lfence-always-serializing'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='rdctl-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='mds-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='pschange-mc-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='gds-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='rfds-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='disable' name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </mode>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <mode name='custom' supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Broadwell'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Broadwell-IBRS'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Broadwell-noTSX'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Broadwell-noTSX-IBRS'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Broadwell-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Broadwell-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Broadwell-v3'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Broadwell-v4'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Cascadelake-Server'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Cascadelake-Server-noTSX'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Cascadelake-Server-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Cascadelake-Server-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Cascadelake-Server-v3'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Cascadelake-Server-v4'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Cascadelake-Server-v5'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Cooperlake'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Cooperlake-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Cooperlake-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Denverton'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='mpx'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Denverton-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='mpx'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Denverton-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Denverton-v3'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Dhyana-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='EPYC-Genoa'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amd-psfd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='auto-ibrs'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='no-nested-data-bp'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='null-sel-clr-base'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='stibp-always-on'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='EPYC-Genoa-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amd-psfd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='auto-ibrs'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='no-nested-data-bp'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='null-sel-clr-base'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='stibp-always-on'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='EPYC-Milan'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='EPYC-Milan-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='EPYC-Milan-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amd-psfd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='no-nested-data-bp'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='null-sel-clr-base'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='stibp-always-on'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='EPYC-Rome'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='EPYC-Rome-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='EPYC-Rome-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='EPYC-Rome-v3'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='EPYC-v3'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='EPYC-v4'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='GraniteRapids'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-fp16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-int8'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-tile'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-fp16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='bus-lock-detect'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fbsdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrc'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrs'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fzrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='mcdt-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pbrsb-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='prefetchiti'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='psdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='sbdr-ssdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='serialize'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='tsx-ldtrk'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xfd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='GraniteRapids-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-fp16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-int8'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-tile'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-fp16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='bus-lock-detect'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fbsdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrc'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrs'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fzrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='mcdt-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pbrsb-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='prefetchiti'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='psdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='sbdr-ssdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='serialize'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='tsx-ldtrk'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xfd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='GraniteRapids-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-fp16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-int8'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-tile'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx10'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx10-128'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx10-256'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx10-512'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-fp16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='bus-lock-detect'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='cldemote'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fbsdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrc'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrs'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fzrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='mcdt-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='movdir64b'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='movdiri'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pbrsb-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='prefetchiti'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='psdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='sbdr-ssdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='serialize'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ss'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='tsx-ldtrk'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xfd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Haswell'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Haswell-IBRS'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Haswell-noTSX'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Haswell-noTSX-IBRS'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Haswell-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Haswell-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Haswell-v3'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Haswell-v4'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Icelake-Server'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Icelake-Server-noTSX'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Icelake-Server-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Icelake-Server-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Icelake-Server-v3'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Icelake-Server-v4'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Icelake-Server-v5'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Icelake-Server-v6'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Icelake-Server-v7'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='IvyBridge'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='IvyBridge-IBRS'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='IvyBridge-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='IvyBridge-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='KnightsMill'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-4fmaps'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-4vnniw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512er'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512pf'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ss'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='KnightsMill-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-4fmaps'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-4vnniw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512er'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512pf'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ss'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Opteron_G4'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fma4'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xop'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Opteron_G4-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fma4'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xop'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Opteron_G5'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fma4'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='tbm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xop'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Opteron_G5-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fma4'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='tbm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xop'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='SapphireRapids'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-int8'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-tile'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-fp16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='bus-lock-detect'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrc'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrs'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fzrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='serialize'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='tsx-ldtrk'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xfd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='SapphireRapids-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-int8'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-tile'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-fp16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='bus-lock-detect'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrc'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrs'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fzrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='serialize'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='tsx-ldtrk'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xfd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='SapphireRapids-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-int8'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-tile'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-fp16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='bus-lock-detect'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fbsdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrc'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrs'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fzrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='psdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='sbdr-ssdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='serialize'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='tsx-ldtrk'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xfd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='SapphireRapids-v3'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-int8'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-tile'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-fp16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='bus-lock-detect'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='cldemote'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fbsdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrc'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrs'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fzrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='movdir64b'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='movdiri'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='psdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='sbdr-ssdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='serialize'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ss'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='tsx-ldtrk'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xfd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='SierraForest'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-ne-convert'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-vnni-int8'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='bus-lock-detect'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='cmpccxadd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fbsdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrs'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='mcdt-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pbrsb-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='psdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='sbdr-ssdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='serialize'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='SierraForest-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-ne-convert'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-vnni-int8'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='bus-lock-detect'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='cmpccxadd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fbsdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrs'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='mcdt-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pbrsb-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='psdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='sbdr-ssdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='serialize'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Client'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Client-IBRS'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Client-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Client-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Client-v3'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Client-v4'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Server'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Server-IBRS'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Server-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Server-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Server-v3'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Server-v4'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Server-v5'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Snowridge'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='cldemote'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='core-capability'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='movdir64b'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='movdiri'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='mpx'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='split-lock-detect'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Snowridge-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='cldemote'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='core-capability'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='movdir64b'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='movdiri'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='mpx'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='split-lock-detect'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Snowridge-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='cldemote'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='core-capability'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='movdir64b'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='movdiri'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='split-lock-detect'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Snowridge-v3'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='cldemote'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='core-capability'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='movdir64b'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='movdiri'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='split-lock-detect'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Snowridge-v4'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='cldemote'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='movdir64b'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='movdiri'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='athlon'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='3dnow'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='3dnowext'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='athlon-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='3dnow'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='3dnowext'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='core2duo'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ss'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='core2duo-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ss'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='coreduo'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ss'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='coreduo-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ss'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='n270'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ss'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='n270-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ss'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='phenom'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='3dnow'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='3dnowext'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='phenom-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='3dnow'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='3dnowext'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </mode>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   </cpu>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   <memoryBacking supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <enum name='sourceType'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <value>file</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <value>anonymous</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <value>memfd</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   </memoryBacking>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   <devices>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <disk supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='diskDevice'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>disk</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>cdrom</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>floppy</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>lun</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='bus'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>ide</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>fdc</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>scsi</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>virtio</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>usb</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>sata</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='model'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>virtio</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>virtio-transitional</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>virtio-non-transitional</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </disk>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <graphics supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='type'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>vnc</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>egl-headless</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>dbus</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </graphics>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <video supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='modelType'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>vga</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>cirrus</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>virtio</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>none</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>bochs</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>ramfb</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </video>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <hostdev supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='mode'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>subsystem</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='startupPolicy'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>default</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>mandatory</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>requisite</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>optional</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='subsysType'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>usb</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>pci</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>scsi</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='capsType'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='pciBackend'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </hostdev>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <rng supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='model'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>virtio</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>virtio-transitional</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>virtio-non-transitional</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='backendModel'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>random</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>egd</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>builtin</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </rng>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <filesystem supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='driverType'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>path</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>handle</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>virtiofs</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </filesystem>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <tpm supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='model'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>tpm-tis</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>tpm-crb</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='backendModel'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>emulator</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>external</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='backendVersion'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>2.0</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </tpm>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <redirdev supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='bus'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>usb</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </redirdev>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <channel supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='type'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>pty</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>unix</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </channel>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <crypto supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='model'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='type'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>qemu</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='backendModel'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>builtin</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </crypto>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <interface supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='backendType'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>default</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>passt</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </interface>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <panic supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='model'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>isa</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>hyperv</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </panic>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   </devices>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   <features>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <gic supported='no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <vmcoreinfo supported='yes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <genid supported='yes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <backingStoreInput supported='yes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <backup supported='yes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <async-teardown supported='yes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <ps2 supported='yes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <sev supported='no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <sgx supported='no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <hyperv supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='features'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>relaxed</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>vapic</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>spinlocks</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>vpindex</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>runtime</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>synic</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>stimer</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>reset</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>vendor_id</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>frequencies</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>reenlightenment</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>tlbflush</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>ipi</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>avic</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>emsr_bitmap</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>xmm_input</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </hyperv>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <launchSecurity supported='no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   </features>
Oct 07 21:41:12 compute-0 nova_compute[192716]: </domainCapabilities>
Oct 07 21:41:12 compute-0 nova_compute[192716]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Oct 07 21:41:12 compute-0 nova_compute[192716]: 2025-10-07 21:41:12.442 2 DEBUG nova.virt.libvirt.host [None req-03a9d3b0-079e-4ca8-a359-3341fb98af42 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Oct 07 21:41:12 compute-0 nova_compute[192716]: <domainCapabilities>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   <path>/usr/libexec/qemu-kvm</path>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   <domain>kvm</domain>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   <machine>pc-q35-rhel9.6.0</machine>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   <arch>x86_64</arch>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   <vcpu max='4096'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   <iothreads supported='yes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   <os supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <enum name='firmware'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <value>efi</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <loader supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='type'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>rom</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>pflash</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='readonly'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>yes</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>no</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='secure'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>yes</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>no</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </loader>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   </os>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   <cpu>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <mode name='host-passthrough' supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='hostPassthroughMigratable'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>on</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>off</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </mode>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <mode name='maximum' supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='maximumMigratable'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>on</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>off</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </mode>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <mode name='host-model' supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model fallback='forbid'>EPYC-Rome</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <vendor>AMD</vendor>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <maxphysaddr mode='passthrough' limit='40'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='x2apic'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='tsc-deadline'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='hypervisor'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='tsc_adjust'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='spec-ctrl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='stibp'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='arch-capabilities'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='ssbd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='cmp_legacy'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='overflow-recov'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='succor'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='ibrs'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='amd-ssbd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='virt-ssbd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='lbrv'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='tsc-scale'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='vmcb-clean'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='flushbyasid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='pause-filter'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='pfthreshold'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='svme-addr-chk'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='lfence-always-serializing'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='rdctl-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='mds-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='pschange-mc-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='gds-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='require' name='rfds-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <feature policy='disable' name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </mode>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <mode name='custom' supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Broadwell'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Broadwell-IBRS'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Broadwell-noTSX'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Broadwell-noTSX-IBRS'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Broadwell-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Broadwell-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Broadwell-v3'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Broadwell-v4'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Cascadelake-Server'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Cascadelake-Server-noTSX'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Cascadelake-Server-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Cascadelake-Server-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Cascadelake-Server-v3'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Cascadelake-Server-v4'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Cascadelake-Server-v5'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Cooperlake'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Cooperlake-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Cooperlake-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Denverton'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='mpx'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Denverton-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='mpx'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Denverton-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Denverton-v3'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Dhyana-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='EPYC-Genoa'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amd-psfd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='auto-ibrs'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='no-nested-data-bp'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='null-sel-clr-base'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='stibp-always-on'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='EPYC-Genoa-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amd-psfd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='auto-ibrs'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='no-nested-data-bp'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='null-sel-clr-base'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='stibp-always-on'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='EPYC-Milan'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='EPYC-Milan-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='EPYC-Milan-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amd-psfd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='no-nested-data-bp'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='null-sel-clr-base'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='stibp-always-on'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='EPYC-Rome'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='EPYC-Rome-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='EPYC-Rome-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='EPYC-Rome-v3'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='EPYC-v3'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='EPYC-v4'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='GraniteRapids'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-fp16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-int8'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-tile'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-fp16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='bus-lock-detect'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fbsdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrc'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrs'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fzrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='mcdt-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pbrsb-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='prefetchiti'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='psdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='sbdr-ssdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='serialize'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='tsx-ldtrk'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xfd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='GraniteRapids-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-fp16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-int8'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-tile'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-fp16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='bus-lock-detect'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fbsdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrc'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrs'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fzrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='mcdt-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pbrsb-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='prefetchiti'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='psdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='sbdr-ssdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='serialize'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='tsx-ldtrk'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xfd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='GraniteRapids-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-fp16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-int8'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-tile'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx10'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx10-128'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx10-256'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx10-512'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-fp16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='bus-lock-detect'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='cldemote'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fbsdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrc'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrs'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fzrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='mcdt-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='movdir64b'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='movdiri'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pbrsb-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='prefetchiti'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='psdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='sbdr-ssdp-no'/>
Oct 07 21:41:12 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='serialize'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ss'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='tsx-ldtrk'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xfd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Haswell'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Haswell-IBRS'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Haswell-noTSX'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Haswell-noTSX-IBRS'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Haswell-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Haswell-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Haswell-v3'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Haswell-v4'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Icelake-Server'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Icelake-Server-noTSX'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Icelake-Server-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Icelake-Server-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Icelake-Server-v3'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Icelake-Server-v4'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Icelake-Server-v5'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Icelake-Server-v6'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Icelake-Server-v7'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='IvyBridge'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='IvyBridge-IBRS'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='IvyBridge-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='IvyBridge-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='KnightsMill'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-4fmaps'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-4vnniw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512er'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512pf'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ss'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='KnightsMill-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-4fmaps'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-4vnniw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512er'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512pf'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ss'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Opteron_G4'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fma4'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xop'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Opteron_G4-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fma4'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xop'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Opteron_G5'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fma4'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='tbm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xop'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Opteron_G5-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fma4'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='tbm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xop'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='SapphireRapids'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-int8'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-tile'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-fp16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='bus-lock-detect'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrc'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrs'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fzrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='serialize'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='tsx-ldtrk'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xfd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='SapphireRapids-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-int8'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-tile'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-fp16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='bus-lock-detect'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrc'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrs'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fzrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='serialize'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='tsx-ldtrk'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xfd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='SapphireRapids-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-int8'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-tile'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-fp16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='bus-lock-detect'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fbsdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrc'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrs'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fzrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='psdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='sbdr-ssdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='serialize'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='tsx-ldtrk'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xfd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='SapphireRapids-v3'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-int8'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='amx-tile'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-bf16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-fp16'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512-vpopcntdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bitalg'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vbmi2'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='bus-lock-detect'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='cldemote'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fbsdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrc'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrs'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fzrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='la57'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='movdir64b'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='movdiri'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='psdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='sbdr-ssdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='serialize'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ss'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='taa-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='tsx-ldtrk'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xfd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='SierraForest'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-ne-convert'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-vnni-int8'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='bus-lock-detect'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='cmpccxadd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fbsdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrs'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='mcdt-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pbrsb-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='psdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='sbdr-ssdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='serialize'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='SierraForest-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-ifma'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-ne-convert'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-vnni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx-vnni-int8'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='bus-lock-detect'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='cmpccxadd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fbsdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='fsrs'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ibrs-all'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='mcdt-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pbrsb-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='psdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='sbdr-ssdp-no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='serialize'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vaes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='vpclmulqdq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Client'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Client-IBRS'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Client-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Client-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Client-v3'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct 07 21:41:12 compute-0 systemd[1]: Started libvirt nodedev daemon.
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Client-v4'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Server'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Server-IBRS'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Server-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Server-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='hle'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='rtm'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Server-v3'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Server-v4'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Skylake-Server-v5'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512bw'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512cd'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512dq'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512f'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='avx512vl'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='invpcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pcid'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='pku'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Snowridge'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='cldemote'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='core-capability'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='movdir64b'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='movdiri'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='mpx'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='split-lock-detect'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Snowridge-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='cldemote'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='core-capability'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='movdir64b'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='movdiri'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='mpx'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='split-lock-detect'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Snowridge-v2'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='cldemote'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='core-capability'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='movdir64b'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='movdiri'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='split-lock-detect'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Snowridge-v3'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='cldemote'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='core-capability'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='movdir64b'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='movdiri'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='split-lock-detect'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='Snowridge-v4'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='cldemote'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='erms'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='gfni'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='movdir64b'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='movdiri'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='xsaves'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='athlon'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='3dnow'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='3dnowext'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='athlon-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='3dnow'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='3dnowext'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='core2duo'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ss'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='core2duo-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ss'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='coreduo'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ss'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='coreduo-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ss'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='n270'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ss'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='n270-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='ss'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='phenom'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='3dnow'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='3dnowext'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <blockers model='phenom-v1'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='3dnow'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <feature name='3dnowext'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </blockers>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </mode>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   </cpu>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   <memoryBacking supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <enum name='sourceType'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <value>file</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <value>anonymous</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <value>memfd</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   </memoryBacking>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   <devices>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <disk supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='diskDevice'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>disk</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>cdrom</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>floppy</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>lun</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='bus'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>fdc</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>scsi</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>virtio</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>usb</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>sata</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='model'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>virtio</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>virtio-transitional</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>virtio-non-transitional</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </disk>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <graphics supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='type'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>vnc</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>egl-headless</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>dbus</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </graphics>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <video supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='modelType'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>vga</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>cirrus</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>virtio</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>none</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>bochs</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>ramfb</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </video>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <hostdev supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='mode'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>subsystem</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='startupPolicy'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>default</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>mandatory</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>requisite</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>optional</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='subsysType'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>usb</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>pci</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>scsi</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='capsType'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='pciBackend'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </hostdev>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <rng supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='model'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>virtio</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>virtio-transitional</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>virtio-non-transitional</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='backendModel'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>random</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>egd</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>builtin</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </rng>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <filesystem supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='driverType'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>path</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>handle</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>virtiofs</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </filesystem>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <tpm supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='model'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>tpm-tis</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>tpm-crb</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='backendModel'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>emulator</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>external</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='backendVersion'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>2.0</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </tpm>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <redirdev supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='bus'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>usb</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </redirdev>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <channel supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='type'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>pty</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>unix</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </channel>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <crypto supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='model'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='type'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>qemu</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='backendModel'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>builtin</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </crypto>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <interface supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='backendType'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>default</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>passt</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </interface>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <panic supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='model'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>isa</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>hyperv</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </panic>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   </devices>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   <features>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <gic supported='no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <vmcoreinfo supported='yes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <genid supported='yes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <backingStoreInput supported='yes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <backup supported='yes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <async-teardown supported='yes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <ps2 supported='yes'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <sev supported='no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <sgx supported='no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <hyperv supported='yes'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       <enum name='features'>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>relaxed</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>vapic</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>spinlocks</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>vpindex</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>runtime</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>synic</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>stimer</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>reset</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>vendor_id</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>frequencies</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>reenlightenment</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>tlbflush</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>ipi</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>avic</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>emsr_bitmap</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:         <value>xmm_input</value>
Oct 07 21:41:12 compute-0 nova_compute[192716]:       </enum>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     </hyperv>
Oct 07 21:41:12 compute-0 nova_compute[192716]:     <launchSecurity supported='no'/>
Oct 07 21:41:12 compute-0 nova_compute[192716]:   </features>
Oct 07 21:41:12 compute-0 nova_compute[192716]: </domainCapabilities>
Oct 07 21:41:12 compute-0 nova_compute[192716]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Oct 07 21:41:12 compute-0 nova_compute[192716]: 2025-10-07 21:41:12.498 2 DEBUG nova.virt.libvirt.host [None req-03a9d3b0-079e-4ca8-a359-3341fb98af42 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1877
Oct 07 21:41:12 compute-0 nova_compute[192716]: 2025-10-07 21:41:12.499 2 DEBUG nova.virt.libvirt.host [None req-03a9d3b0-079e-4ca8-a359-3341fb98af42 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1877
Oct 07 21:41:12 compute-0 nova_compute[192716]: 2025-10-07 21:41:12.499 2 DEBUG nova.virt.libvirt.host [None req-03a9d3b0-079e-4ca8-a359-3341fb98af42 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1877
Oct 07 21:41:12 compute-0 nova_compute[192716]: 2025-10-07 21:41:12.499 2 INFO nova.virt.libvirt.host [None req-03a9d3b0-079e-4ca8-a359-3341fb98af42 - - - - - -] Secure Boot support detected
Oct 07 21:41:12 compute-0 nova_compute[192716]: 2025-10-07 21:41:12.506 2 INFO nova.virt.libvirt.driver [None req-03a9d3b0-079e-4ca8-a359-3341fb98af42 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Oct 07 21:41:12 compute-0 nova_compute[192716]: 2025-10-07 21:41:12.506 2 INFO nova.virt.libvirt.driver [None req-03a9d3b0-079e-4ca8-a359-3341fb98af42 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Oct 07 21:41:12 compute-0 nova_compute[192716]: 2025-10-07 21:41:12.671 2 DEBUG nova.virt.libvirt.driver [None req-03a9d3b0-079e-4ca8-a359-3341fb98af42 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1177
Oct 07 21:41:12 compute-0 nova_compute[192716]: 2025-10-07 21:41:12.814 2 WARNING nova.virt.libvirt.driver [None req-03a9d3b0-079e-4ca8-a359-3341fb98af42 - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Oct 07 21:41:12 compute-0 nova_compute[192716]: 2025-10-07 21:41:12.815 2 DEBUG nova.virt.libvirt.volume.mount [None req-03a9d3b0-079e-4ca8-a359-3341fb98af42 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/mount.py:130
Oct 07 21:41:13 compute-0 nova_compute[192716]: 2025-10-07 21:41:13.185 2 INFO nova.virt.node [None req-03a9d3b0-079e-4ca8-a359-3341fb98af42 - - - - - -] Determined node identity 19d1aa8e-e3fb-43ab-9849-122569e48a32 from /var/lib/nova/compute_id
Oct 07 21:41:13 compute-0 nova_compute[192716]: 2025-10-07 21:41:13.694 2 WARNING nova.compute.manager [None req-03a9d3b0-079e-4ca8-a359-3341fb98af42 - - - - - -] Compute nodes ['19d1aa8e-e3fb-43ab-9849-122569e48a32'] for host compute-0.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Oct 07 21:41:14 compute-0 nova_compute[192716]: 2025-10-07 21:41:14.707 2 INFO nova.compute.manager [None req-03a9d3b0-079e-4ca8-a359-3341fb98af42 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Oct 07 21:41:15 compute-0 sshd-session[193036]: Accepted publickey for zuul from 192.168.122.30 port 46074 ssh2: ECDSA SHA256:OH28aSFFDwSUyue/q6XYLqn3ZIRCwAhLEh6x3UJ/Ca0
Oct 07 21:41:15 compute-0 systemd-logind[798]: New session 28 of user zuul.
Oct 07 21:41:15 compute-0 systemd[1]: Started Session 28 of User zuul.
Oct 07 21:41:15 compute-0 sshd-session[193036]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 07 21:41:15 compute-0 nova_compute[192716]: 2025-10-07 21:41:15.725 2 WARNING nova.compute.manager [None req-03a9d3b0-079e-4ca8-a359-3341fb98af42 - - - - - -] No compute node record found for host compute-0.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Oct 07 21:41:15 compute-0 nova_compute[192716]: 2025-10-07 21:41:15.726 2 DEBUG oslo_concurrency.lockutils [None req-03a9d3b0-079e-4ca8-a359-3341fb98af42 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:41:15 compute-0 nova_compute[192716]: 2025-10-07 21:41:15.726 2 DEBUG oslo_concurrency.lockutils [None req-03a9d3b0-079e-4ca8-a359-3341fb98af42 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:41:15 compute-0 nova_compute[192716]: 2025-10-07 21:41:15.727 2 DEBUG oslo_concurrency.lockutils [None req-03a9d3b0-079e-4ca8-a359-3341fb98af42 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:41:15 compute-0 nova_compute[192716]: 2025-10-07 21:41:15.727 2 DEBUG nova.compute.resource_tracker [None req-03a9d3b0-079e-4ca8-a359-3341fb98af42 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 07 21:41:15 compute-0 nova_compute[192716]: 2025-10-07 21:41:15.909 2 WARNING nova.virt.libvirt.driver [None req-03a9d3b0-079e-4ca8-a359-3341fb98af42 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 21:41:15 compute-0 nova_compute[192716]: 2025-10-07 21:41:15.910 2 DEBUG oslo_concurrency.processutils [None req-03a9d3b0-079e-4ca8-a359-3341fb98af42 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:41:15 compute-0 nova_compute[192716]: 2025-10-07 21:41:15.928 2 DEBUG oslo_concurrency.processutils [None req-03a9d3b0-079e-4ca8-a359-3341fb98af42 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.018s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:41:15 compute-0 nova_compute[192716]: 2025-10-07 21:41:15.929 2 DEBUG nova.compute.resource_tracker [None req-03a9d3b0-079e-4ca8-a359-3341fb98af42 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6211MB free_disk=73.51190567016602GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 07 21:41:15 compute-0 nova_compute[192716]: 2025-10-07 21:41:15.929 2 DEBUG oslo_concurrency.lockutils [None req-03a9d3b0-079e-4ca8-a359-3341fb98af42 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:41:15 compute-0 nova_compute[192716]: 2025-10-07 21:41:15.929 2 DEBUG oslo_concurrency.lockutils [None req-03a9d3b0-079e-4ca8-a359-3341fb98af42 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:41:16 compute-0 python3.9[193189]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 07 21:41:16 compute-0 nova_compute[192716]: 2025-10-07 21:41:16.435 2 WARNING nova.compute.resource_tracker [None req-03a9d3b0-079e-4ca8-a359-3341fb98af42 - - - - - -] No compute node record for compute-0.ctlplane.example.com:19d1aa8e-e3fb-43ab-9849-122569e48a32: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 19d1aa8e-e3fb-43ab-9849-122569e48a32 could not be found.
Oct 07 21:41:16 compute-0 nova_compute[192716]: 2025-10-07 21:41:16.946 2 INFO nova.compute.resource_tracker [None req-03a9d3b0-079e-4ca8-a359-3341fb98af42 - - - - - -] Compute node record created for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com with uuid: 19d1aa8e-e3fb-43ab-9849-122569e48a32
Oct 07 21:41:17 compute-0 sudo[193344]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwtaswazywqkttfbqghgwmeunutmzeye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873276.9922445-52-206904765364822/AnsiballZ_systemd_service.py'
Oct 07 21:41:17 compute-0 sudo[193344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:41:18 compute-0 python3.9[193346]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 07 21:41:18 compute-0 systemd[1]: Reloading.
Oct 07 21:41:18 compute-0 systemd-rc-local-generator[193391]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:41:18 compute-0 systemd-sysv-generator[193396]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:41:18 compute-0 podman[193349]: 2025-10-07 21:41:18.267984893 +0000 UTC m=+0.119018080 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0)
Oct 07 21:41:18 compute-0 podman[193348]: 2025-10-07 21:41:18.279060686 +0000 UTC m=+0.126641822 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible)
Oct 07 21:41:18 compute-0 sudo[193344]: pam_unix(sudo:session): session closed for user root
Oct 07 21:41:18 compute-0 nova_compute[192716]: 2025-10-07 21:41:18.474 2 DEBUG nova.compute.resource_tracker [None req-03a9d3b0-079e-4ca8-a359-3341fb98af42 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 07 21:41:18 compute-0 nova_compute[192716]: 2025-10-07 21:41:18.475 2 DEBUG nova.compute.resource_tracker [None req-03a9d3b0-079e-4ca8-a359-3341fb98af42 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:41:15 up 50 min,  0 user,  load average: 1.01, 0.89, 0.65\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 07 21:41:19 compute-0 nova_compute[192716]: 2025-10-07 21:41:19.330 2 INFO nova.scheduler.client.report [None req-03a9d3b0-079e-4ca8-a359-3341fb98af42 - - - - - -] [req-85c0ea9a-d8dd-4fa8-bded-3e56194f40f3] Created resource provider record via placement API for resource provider with UUID 19d1aa8e-e3fb-43ab-9849-122569e48a32 and name compute-0.ctlplane.example.com.
Oct 07 21:41:19 compute-0 nova_compute[192716]: 2025-10-07 21:41:19.354 2 DEBUG nova.virt.libvirt.host [None req-03a9d3b0-079e-4ca8-a359-3341fb98af42 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Oct 07 21:41:19 compute-0 nova_compute[192716]: ] _kernel_supports_amd_sev /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1953
Oct 07 21:41:19 compute-0 nova_compute[192716]: 2025-10-07 21:41:19.355 2 INFO nova.virt.libvirt.host [None req-03a9d3b0-079e-4ca8-a359-3341fb98af42 - - - - - -] kernel doesn't support AMD SEV
Oct 07 21:41:19 compute-0 nova_compute[192716]: 2025-10-07 21:41:19.356 2 DEBUG nova.compute.provider_tree [None req-03a9d3b0-079e-4ca8-a359-3341fb98af42 - - - - - -] Updating inventory in ProviderTree for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Oct 07 21:41:19 compute-0 nova_compute[192716]: 2025-10-07 21:41:19.356 2 DEBUG nova.virt.libvirt.driver [None req-03a9d3b0-079e-4ca8-a359-3341fb98af42 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Oct 07 21:41:19 compute-0 python3.9[193568]: ansible-ansible.builtin.service_facts Invoked
Oct 07 21:41:19 compute-0 network[193585]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 07 21:41:19 compute-0 network[193586]: 'network-scripts' will be removed from distribution in near future.
Oct 07 21:41:19 compute-0 network[193587]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 07 21:41:19 compute-0 nova_compute[192716]: 2025-10-07 21:41:19.937 2 DEBUG nova.scheduler.client.report [None req-03a9d3b0-079e-4ca8-a359-3341fb98af42 - - - - - -] Updated inventory for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:975
Oct 07 21:41:19 compute-0 nova_compute[192716]: 2025-10-07 21:41:19.941 2 DEBUG nova.compute.provider_tree [None req-03a9d3b0-079e-4ca8-a359-3341fb98af42 - - - - - -] Updating resource provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Oct 07 21:41:19 compute-0 nova_compute[192716]: 2025-10-07 21:41:19.941 2 DEBUG nova.compute.provider_tree [None req-03a9d3b0-079e-4ca8-a359-3341fb98af42 - - - - - -] Updating inventory in ProviderTree for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Oct 07 21:41:20 compute-0 nova_compute[192716]: 2025-10-07 21:41:20.107 2 DEBUG nova.compute.provider_tree [None req-03a9d3b0-079e-4ca8-a359-3341fb98af42 - - - - - -] Updating resource provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Oct 07 21:41:20 compute-0 nova_compute[192716]: 2025-10-07 21:41:20.617 2 DEBUG nova.compute.resource_tracker [None req-03a9d3b0-079e-4ca8-a359-3341fb98af42 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 07 21:41:20 compute-0 nova_compute[192716]: 2025-10-07 21:41:20.617 2 DEBUG oslo_concurrency.lockutils [None req-03a9d3b0-079e-4ca8-a359-3341fb98af42 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.688s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:41:20 compute-0 nova_compute[192716]: 2025-10-07 21:41:20.618 2 DEBUG nova.service [None req-03a9d3b0-079e-4ca8-a359-3341fb98af42 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.12/site-packages/nova/service.py:177
Oct 07 21:41:20 compute-0 nova_compute[192716]: 2025-10-07 21:41:20.726 2 DEBUG nova.service [None req-03a9d3b0-079e-4ca8-a359-3341fb98af42 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.12/site-packages/nova/service.py:194
Oct 07 21:41:20 compute-0 nova_compute[192716]: 2025-10-07 21:41:20.727 2 DEBUG nova.servicegroup.drivers.db [None req-03a9d3b0-079e-4ca8-a359-3341fb98af42 - - - - - -] DB_Driver: join new ServiceGroup member compute-0.ctlplane.example.com to the compute group, service = <Service: host=compute-0.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.12/site-packages/nova/servicegroup/drivers/db.py:44
Oct 07 21:41:24 compute-0 sudo[193862]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqmwybooenjzvmueaolqyddameraqahw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873284.628248-90-138495370129964/AnsiballZ_systemd_service.py'
Oct 07 21:41:24 compute-0 sudo[193862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:41:25 compute-0 python3.9[193864]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 07 21:41:25 compute-0 sudo[193862]: pam_unix(sudo:session): session closed for user root
Oct 07 21:41:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:41:25.581 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:41:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:41:25.582 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:41:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:41:25.583 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:41:26 compute-0 sudo[194016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kyzcmvnkjvwvjrbudrmeqilzvuxasxwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873285.662226-110-237313759601935/AnsiballZ_file.py'
Oct 07 21:41:26 compute-0 sudo[194016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:41:26 compute-0 python3.9[194018]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:41:26 compute-0 sudo[194016]: pam_unix(sudo:session): session closed for user root
Oct 07 21:41:26 compute-0 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 07 21:41:26 compute-0 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 07 21:41:26 compute-0 sudo[194169]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdkrsffmxjbnqpwaaazzcunfyxfngfrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873286.6477995-126-189779207812971/AnsiballZ_file.py'
Oct 07 21:41:26 compute-0 sudo[194169]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:41:27 compute-0 python3.9[194171]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:41:27 compute-0 sudo[194169]: pam_unix(sudo:session): session closed for user root
Oct 07 21:41:27 compute-0 sudo[194321]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atmygjxtimehhjngsvnwttodwrafeujh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873287.5174756-144-103277550498930/AnsiballZ_command.py'
Oct 07 21:41:27 compute-0 sudo[194321]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:41:28 compute-0 python3.9[194323]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:41:28 compute-0 sudo[194321]: pam_unix(sudo:session): session closed for user root
Oct 07 21:41:28 compute-0 podman[194402]: 2025-10-07 21:41:28.883042734 +0000 UTC m=+0.118806340 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251007, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct 07 21:41:29 compute-0 python3.9[194502]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct 07 21:41:29 compute-0 auditd[705]: Audit daemon rotating log files
Oct 07 21:41:29 compute-0 sudo[194652]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlygzsrtzksslmxgbatciwhvnhcvsezr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873289.5733275-180-1685387906694/AnsiballZ_systemd_service.py'
Oct 07 21:41:29 compute-0 sudo[194652]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:41:30 compute-0 python3.9[194654]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 07 21:41:30 compute-0 systemd[1]: Reloading.
Oct 07 21:41:30 compute-0 systemd-rc-local-generator[194681]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:41:30 compute-0 systemd-sysv-generator[194685]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:41:30 compute-0 sudo[194652]: pam_unix(sudo:session): session closed for user root
Oct 07 21:41:31 compute-0 sudo[194838]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ycnnquqifksxploicduzstxxfunzgyie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873290.7953184-196-228816308085493/AnsiballZ_command.py'
Oct 07 21:41:31 compute-0 sudo[194838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:41:31 compute-0 python3.9[194840]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:41:31 compute-0 sudo[194838]: pam_unix(sudo:session): session closed for user root
Oct 07 21:41:31 compute-0 sudo[194991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udnnqooxppwmmzqkrdzgiytvwiwajmlz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873291.6713939-214-281462621118546/AnsiballZ_file.py'
Oct 07 21:41:31 compute-0 sudo[194991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:41:32 compute-0 python3.9[194993]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:41:32 compute-0 sudo[194991]: pam_unix(sudo:session): session closed for user root
Oct 07 21:41:33 compute-0 python3.9[195143]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 07 21:41:33 compute-0 podman[195269]: 2025-10-07 21:41:33.826253427 +0000 UTC m=+0.064793959 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.build-date=20251007, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Oct 07 21:41:33 compute-0 python3.9[195306]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:41:34 compute-0 python3.9[195435]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759873293.4475946-246-36900713740684/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=e86e0e43000ce9ccfe5aefbf8e8f2e3d15d05584 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:41:35 compute-0 sudo[195585]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vyfsvqwvyyfwkotolsbhjmuvcevenbhe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873295.04236-276-68251377984934/AnsiballZ_group.py'
Oct 07 21:41:35 compute-0 sudo[195585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:41:35 compute-0 python3.9[195587]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Oct 07 21:41:35 compute-0 sudo[195585]: pam_unix(sudo:session): session closed for user root
Oct 07 21:41:36 compute-0 sudo[195737]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlqgoabwosibldyghtowlmkbvoszaubu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873296.2105846-298-223038777958693/AnsiballZ_getent.py'
Oct 07 21:41:36 compute-0 sudo[195737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:41:36 compute-0 python3.9[195739]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Oct 07 21:41:37 compute-0 sudo[195737]: pam_unix(sudo:session): session closed for user root
Oct 07 21:41:37 compute-0 sudo[195890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-peelxrneihbsvnlxmmlenapapougxiyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873297.3049095-314-261473858162688/AnsiballZ_group.py'
Oct 07 21:41:37 compute-0 sudo[195890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:41:37 compute-0 python3.9[195892]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct 07 21:41:37 compute-0 groupadd[195893]: group added to /etc/group: name=ceilometer, GID=42405
Oct 07 21:41:37 compute-0 groupadd[195893]: group added to /etc/gshadow: name=ceilometer
Oct 07 21:41:37 compute-0 groupadd[195893]: new group: name=ceilometer, GID=42405
Oct 07 21:41:37 compute-0 sudo[195890]: pam_unix(sudo:session): session closed for user root
Oct 07 21:41:38 compute-0 sudo[196048]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbbkjaoauohnacspyumrjllmllcckbbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873298.2241178-330-163137337483789/AnsiballZ_user.py'
Oct 07 21:41:38 compute-0 sudo[196048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:41:39 compute-0 python3.9[196050]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct 07 21:41:39 compute-0 useradd[196052]: new user: name=ceilometer, UID=42405, GID=42405, home=/home/ceilometer, shell=/sbin/nologin, from=/dev/pts/0
Oct 07 21:41:39 compute-0 useradd[196052]: add 'ceilometer' to group 'libvirt'
Oct 07 21:41:39 compute-0 useradd[196052]: add 'ceilometer' to shadow group 'libvirt'
Oct 07 21:41:39 compute-0 sudo[196048]: pam_unix(sudo:session): session closed for user root
Oct 07 21:41:40 compute-0 python3.9[196208]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:41:41 compute-0 python3.9[196329]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1759873300.0273519-382-84926134837572/.source.conf _original_basename=ceilometer.conf follow=False checksum=f74f01c63e6cdeca5458ef9aff2a1db5d6a4e4b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:41:41 compute-0 python3.9[196479]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:41:42 compute-0 python3.9[196600]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1759873301.4568388-382-138134658782568/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:41:43 compute-0 python3.9[196750]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:41:43 compute-0 python3.9[196871]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1759873302.609437-382-96851250651684/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:41:44 compute-0 python3.9[197021]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 07 21:41:45 compute-0 python3.9[197173]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 07 21:41:46 compute-0 python3.9[197325]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:41:47 compute-0 python3.9[197446]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759873305.9398148-500-90745310245868/.source.json follow=False _original_basename=ceilometer-agent-compute.json.j2 checksum=264d11e8d3809e7ef745878dce7edd46098e25b2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:41:47 compute-0 python3.9[197596]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:41:48 compute-0 python3.9[197672]: ansible-ansible.legacy.file Invoked with mode=420 dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf _original_basename=ceilometer-host-specific.conf.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:41:48 compute-0 podman[197773]: 2025-10-07 21:41:48.860816545 +0000 UTC m=+0.085873770 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 07 21:41:48 compute-0 podman[197772]: 2025-10-07 21:41:48.870785739 +0000 UTC m=+0.097778931 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_id=iscsid, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0)
Oct 07 21:41:49 compute-0 python3.9[197860]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:41:49 compute-0 python3.9[197981]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759873308.5186396-500-121262848531466/.source.json follow=False _original_basename=ceilometer_agent_compute.json.j2 checksum=54faed0c4541f23a28bb8408b81fcfb12456f723 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:41:50 compute-0 python3.9[198132]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:41:50 compute-0 python3.9[198253]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759873309.8550608-500-178288034046478/.source.yaml follow=False _original_basename=ceilometer_prom_exporter.yaml.j2 checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:41:51 compute-0 python3.9[198404]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:41:52 compute-0 python3.9[198525]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/firewall.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759873311.133412-500-206884654660/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:41:52 compute-0 python3.9[198675]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:41:53 compute-0 python3.9[198796]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759873312.4400475-500-168063691863315/.source.json follow=False _original_basename=node_exporter.json.j2 checksum=6e4982940d2bfae88404914dfaf72552f6356d81 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:41:54 compute-0 python3.9[198946]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:41:54 compute-0 python3.9[199067]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759873313.7639582-500-50949947154802/.source.yaml follow=False _original_basename=node_exporter.yaml.j2 checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:41:55 compute-0 python3.9[199217]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:41:56 compute-0 python3.9[199338]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759873315.1707954-500-73519273753164/.source.json follow=False _original_basename=openstack_network_exporter.json.j2 checksum=d474f1e4c3dbd24762592c51cbe5311f0a037273 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:41:57 compute-0 python3.9[199488]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:41:57 compute-0 python3.9[199609]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759873316.535867-500-88939505925221/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=2b6bd0891e609bf38a73282f42888052b750bed6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:41:58 compute-0 python3.9[199759]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:41:59 compute-0 python3.9[199880]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759873317.880945-500-89170201531564/.source.json follow=False _original_basename=podman_exporter.json.j2 checksum=e342121a88f67e2bae7ebc05d1e6d350470198a5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:41:59 compute-0 podman[199881]: 2025-10-07 21:41:59.186869842 +0000 UTC m=+0.093175365 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251007, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 07 21:41:59 compute-0 python3.9[200057]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:42:00 compute-0 python3.9[200178]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759873319.2447457-500-119983323218041/.source.yaml follow=False _original_basename=podman_exporter.yaml.j2 checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:42:01 compute-0 python3.9[200328]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:42:02 compute-0 python3.9[200404]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/node_exporter.yaml _original_basename=node_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/node_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:42:03 compute-0 python3.9[200554]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:42:03 compute-0 python3.9[200630]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml _original_basename=podman_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/podman_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:42:04 compute-0 podman[200754]: 2025-10-07 21:42:04.316973982 +0000 UTC m=+0.052905860 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Oct 07 21:42:04 compute-0 python3.9[200799]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:42:04 compute-0 python3.9[200875]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml _original_basename=ceilometer_prom_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:42:05 compute-0 sudo[201025]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qivhuwtrvtgvahawfiistctjmcetdbvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873325.288166-878-237633186558795/AnsiballZ_file.py'
Oct 07 21:42:05 compute-0 sudo[201025]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:42:05 compute-0 nova_compute[192716]: 2025-10-07 21:42:05.728 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:42:05 compute-0 python3.9[201027]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:42:05 compute-0 sudo[201025]: pam_unix(sudo:session): session closed for user root
Oct 07 21:42:06 compute-0 nova_compute[192716]: 2025-10-07 21:42:06.269 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:42:06 compute-0 sudo[201177]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmnhtgprqcejmmvfwlvrnxzkfckbnzmc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873326.0882483-894-264351436885689/AnsiballZ_file.py'
Oct 07 21:42:06 compute-0 sudo[201177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:42:06 compute-0 python3.9[201179]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:42:06 compute-0 sudo[201177]: pam_unix(sudo:session): session closed for user root
Oct 07 21:42:07 compute-0 sudo[201329]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-usegpvbiybzwamdvwzpijsjynvmqfxaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873326.8846598-910-272112014967313/AnsiballZ_file.py'
Oct 07 21:42:07 compute-0 sudo[201329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:42:07 compute-0 python3.9[201331]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:42:07 compute-0 sudo[201329]: pam_unix(sudo:session): session closed for user root
Oct 07 21:42:08 compute-0 sudo[201481]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-goreubhaepcrztdejkhzypvvxwifejqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873327.864062-926-31191935592845/AnsiballZ_systemd_service.py'
Oct 07 21:42:08 compute-0 sudo[201481]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:42:08 compute-0 python3.9[201483]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 07 21:42:08 compute-0 systemd[1]: Reloading.
Oct 07 21:42:08 compute-0 systemd-rc-local-generator[201508]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:42:08 compute-0 systemd-sysv-generator[201511]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:42:08 compute-0 systemd[1]: Listening on Podman API Socket.
Oct 07 21:42:09 compute-0 sudo[201481]: pam_unix(sudo:session): session closed for user root
Oct 07 21:42:09 compute-0 sudo[201671]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqnngfjthtdpvtgyrwjovumpitdxtqgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873329.3769205-944-85365273850835/AnsiballZ_stat.py'
Oct 07 21:42:09 compute-0 sudo[201671]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:42:09 compute-0 python3.9[201673]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:42:09 compute-0 sudo[201671]: pam_unix(sudo:session): session closed for user root
Oct 07 21:42:09 compute-0 nova_compute[192716]: 2025-10-07 21:42:09.992 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:42:09 compute-0 nova_compute[192716]: 2025-10-07 21:42:09.993 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:42:09 compute-0 nova_compute[192716]: 2025-10-07 21:42:09.993 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:42:09 compute-0 nova_compute[192716]: 2025-10-07 21:42:09.993 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:42:09 compute-0 nova_compute[192716]: 2025-10-07 21:42:09.994 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:42:09 compute-0 nova_compute[192716]: 2025-10-07 21:42:09.994 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:42:09 compute-0 nova_compute[192716]: 2025-10-07 21:42:09.994 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:42:09 compute-0 nova_compute[192716]: 2025-10-07 21:42:09.995 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 07 21:42:09 compute-0 nova_compute[192716]: 2025-10-07 21:42:09.995 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:42:10 compute-0 sudo[201794]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhrglaylpgkpwjsmazdjeproiqhhdqzz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873329.3769205-944-85365273850835/AnsiballZ_copy.py'
Oct 07 21:42:10 compute-0 sudo[201794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:42:10 compute-0 nova_compute[192716]: 2025-10-07 21:42:10.509 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:42:10 compute-0 nova_compute[192716]: 2025-10-07 21:42:10.509 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:42:10 compute-0 nova_compute[192716]: 2025-10-07 21:42:10.509 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:42:10 compute-0 nova_compute[192716]: 2025-10-07 21:42:10.510 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 07 21:42:10 compute-0 nova_compute[192716]: 2025-10-07 21:42:10.683 2 WARNING nova.virt.libvirt.driver [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 21:42:10 compute-0 nova_compute[192716]: 2025-10-07 21:42:10.685 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:42:10 compute-0 python3.9[201796]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759873329.3769205-944-85365273850835/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:42:10 compute-0 nova_compute[192716]: 2025-10-07 21:42:10.714 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.029s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:42:10 compute-0 nova_compute[192716]: 2025-10-07 21:42:10.715 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6180MB free_disk=73.51132583618164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 07 21:42:10 compute-0 nova_compute[192716]: 2025-10-07 21:42:10.715 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:42:10 compute-0 nova_compute[192716]: 2025-10-07 21:42:10.715 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:42:10 compute-0 sudo[201794]: pam_unix(sudo:session): session closed for user root
Oct 07 21:42:11 compute-0 sudo[201947]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hszfngqfloaxewwqupszckxvxbgkjgyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873331.0986748-978-61972944853166/AnsiballZ_container_config_data.py'
Oct 07 21:42:11 compute-0 sudo[201947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:42:11 compute-0 nova_compute[192716]: 2025-10-07 21:42:11.772 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 07 21:42:11 compute-0 nova_compute[192716]: 2025-10-07 21:42:11.773 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:42:10 up 51 min,  0 user,  load average: 0.82, 0.85, 0.65\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 07 21:42:11 compute-0 nova_compute[192716]: 2025-10-07 21:42:11.863 2 DEBUG nova.compute.provider_tree [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 21:42:11 compute-0 python3.9[201949]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=podman_exporter.json debug=False
Oct 07 21:42:11 compute-0 sudo[201947]: pam_unix(sudo:session): session closed for user root
Oct 07 21:42:12 compute-0 nova_compute[192716]: 2025-10-07 21:42:12.373 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 21:42:12 compute-0 sshd-session[198131]: Invalid user ftpuser from 27.79.44.171 port 57238
Oct 07 21:42:12 compute-0 sudo[202100]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akngohtezhkaradmwqnmkrooedxskksx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873332.2625885-996-226049069479263/AnsiballZ_container_config_hash.py'
Oct 07 21:42:12 compute-0 sudo[202100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:42:12 compute-0 nova_compute[192716]: 2025-10-07 21:42:12.885 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 07 21:42:12 compute-0 nova_compute[192716]: 2025-10-07 21:42:12.886 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.171s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:42:13 compute-0 python3.9[202102]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 07 21:42:13 compute-0 sudo[202100]: pam_unix(sudo:session): session closed for user root
Oct 07 21:42:13 compute-0 sudo[202253]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxtajonajmzyhcfxqicyxxructohxfxj ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759873333.3723028-1016-177811761423515/AnsiballZ_edpm_container_manage.py'
Oct 07 21:42:13 compute-0 sudo[202253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:42:13 compute-0 sshd-session[198131]: pam_unix(sshd:auth): check pass; user unknown
Oct 07 21:42:13 compute-0 sshd-session[198131]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=27.79.44.171
Oct 07 21:42:14 compute-0 python3[202255]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=podman_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Oct 07 21:42:14 compute-0 sshd-session[202074]: Invalid user username from 116.110.151.5 port 47534
Oct 07 21:42:14 compute-0 sshd-session[202074]: pam_unix(sshd:auth): check pass; user unknown
Oct 07 21:42:14 compute-0 sshd-session[202074]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=116.110.151.5
Oct 07 21:42:15 compute-0 podman[202269]: 2025-10-07 21:42:15.64421796 +0000 UTC m=+1.356556749 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Oct 07 21:42:15 compute-0 sshd-session[198131]: Failed password for invalid user ftpuser from 27.79.44.171 port 57238 ssh2
Oct 07 21:42:15 compute-0 podman[202367]: 2025-10-07 21:42:15.842109009 +0000 UTC m=+0.055709452 container create 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=edpm, container_name=podman_exporter)
Oct 07 21:42:15 compute-0 podman[202367]: 2025-10-07 21:42:15.815664 +0000 UTC m=+0.029264523 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Oct 07 21:42:15 compute-0 python3[202255]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env OS_ENDPOINT_TYPE=internal --env CONTAINER_HOST=unix:///run/podman/podman.sock --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=edpm --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Oct 07 21:42:16 compute-0 sudo[202253]: pam_unix(sudo:session): session closed for user root
Oct 07 21:42:16 compute-0 sshd-session[202074]: Failed password for invalid user username from 116.110.151.5 port 47534 ssh2
Oct 07 21:42:16 compute-0 sudo[202555]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eylulhwnkyzzgjmrddtpswblpmmmelat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873336.264292-1032-108781544442132/AnsiballZ_stat.py'
Oct 07 21:42:16 compute-0 sudo[202555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:42:16 compute-0 python3.9[202557]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 07 21:42:16 compute-0 sudo[202555]: pam_unix(sudo:session): session closed for user root
Oct 07 21:42:17 compute-0 sshd-session[202074]: Connection closed by invalid user username 116.110.151.5 port 47534 [preauth]
Oct 07 21:42:17 compute-0 sudo[202709]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vopzlgycxngbbvpmhauvvqbuwrlfqwyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873337.2634494-1050-113804719629472/AnsiballZ_file.py'
Oct 07 21:42:17 compute-0 sudo[202709]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:42:17 compute-0 python3.9[202711]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:42:17 compute-0 sudo[202709]: pam_unix(sudo:session): session closed for user root
Oct 07 21:42:18 compute-0 sudo[202860]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwjlgsvjelkjjsfzudzbzemkkgkzxole ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873337.9269643-1050-246020841478293/AnsiballZ_copy.py'
Oct 07 21:42:18 compute-0 sudo[202860]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:42:18 compute-0 python3.9[202862]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759873337.9269643-1050-246020841478293/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:42:18 compute-0 sudo[202860]: pam_unix(sudo:session): session closed for user root
Oct 07 21:42:19 compute-0 sudo[202958]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-menlqpysvgcoesoemuxjmxemixqgurtv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873337.9269643-1050-246020841478293/AnsiballZ_systemd.py'
Oct 07 21:42:19 compute-0 sudo[202958]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:42:19 compute-0 podman[202911]: 2025-10-07 21:42:19.358393192 +0000 UTC m=+0.095824424 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251007, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 07 21:42:19 compute-0 podman[202910]: 2025-10-07 21:42:19.388286833 +0000 UTC m=+0.125789447 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, org.label-schema.build-date=20251007, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_managed=true)
Oct 07 21:42:19 compute-0 python3.9[202969]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 07 21:42:19 compute-0 systemd[1]: Reloading.
Oct 07 21:42:19 compute-0 systemd-rc-local-generator[203004]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:42:19 compute-0 systemd-sysv-generator[203008]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:42:20 compute-0 sudo[202958]: pam_unix(sudo:session): session closed for user root
Oct 07 21:42:20 compute-0 sudo[203084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjqeuyelllxybepgxhnznzqgfjwozsbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873337.9269643-1050-246020841478293/AnsiballZ_systemd.py'
Oct 07 21:42:20 compute-0 sudo[203084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:42:20 compute-0 python3.9[203086]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 07 21:42:20 compute-0 systemd[1]: Reloading.
Oct 07 21:42:20 compute-0 sshd-session[198131]: Connection closed by invalid user ftpuser 27.79.44.171 port 57238 [preauth]
Oct 07 21:42:20 compute-0 systemd-sysv-generator[203118]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:42:20 compute-0 systemd-rc-local-generator[203113]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:42:21 compute-0 systemd[1]: Starting podman_exporter container...
Oct 07 21:42:21 compute-0 systemd[1]: Started libcrun container.
Oct 07 21:42:21 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7153d272ff3d9e7321d8366588135f18c761fac84148707746d2df4765f8c277/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct 07 21:42:21 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7153d272ff3d9e7321d8366588135f18c761fac84148707746d2df4765f8c277/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct 07 21:42:21 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb.
Oct 07 21:42:21 compute-0 podman[203126]: 2025-10-07 21:42:21.301646871 +0000 UTC m=+0.165365112 container init 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 07 21:42:21 compute-0 podman_exporter[203142]: ts=2025-10-07T21:42:21.326Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Oct 07 21:42:21 compute-0 podman_exporter[203142]: ts=2025-10-07T21:42:21.326Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Oct 07 21:42:21 compute-0 podman_exporter[203142]: ts=2025-10-07T21:42:21.327Z caller=handler.go:94 level=info msg="enabled collectors"
Oct 07 21:42:21 compute-0 podman_exporter[203142]: ts=2025-10-07T21:42:21.327Z caller=handler.go:105 level=info collector=container
Oct 07 21:42:21 compute-0 podman[203126]: 2025-10-07 21:42:21.336515558 +0000 UTC m=+0.200233799 container start 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 07 21:42:21 compute-0 podman[203126]: podman_exporter
Oct 07 21:42:21 compute-0 systemd[1]: Starting Podman API Service...
Oct 07 21:42:21 compute-0 systemd[1]: Started Podman API Service.
Oct 07 21:42:21 compute-0 systemd[1]: Started podman_exporter container.
Oct 07 21:42:21 compute-0 podman[203153]: time="2025-10-07T21:42:21Z" level=info msg="/usr/bin/podman filtering at log level info"
Oct 07 21:42:21 compute-0 podman[203153]: time="2025-10-07T21:42:21Z" level=info msg="Setting parallel job count to 25"
Oct 07 21:42:21 compute-0 podman[203153]: time="2025-10-07T21:42:21Z" level=info msg="Using sqlite as database backend"
Oct 07 21:42:21 compute-0 podman[203153]: time="2025-10-07T21:42:21Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Oct 07 21:42:21 compute-0 podman[203153]: time="2025-10-07T21:42:21Z" level=info msg="Using systemd socket activation to determine API endpoint"
Oct 07 21:42:21 compute-0 podman[203153]: time="2025-10-07T21:42:21Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Oct 07 21:42:21 compute-0 sudo[203084]: pam_unix(sudo:session): session closed for user root
Oct 07 21:42:21 compute-0 podman[203153]: @ - - [07/Oct/2025:21:42:21 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Oct 07 21:42:21 compute-0 podman[203153]: time="2025-10-07T21:42:21Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 21:42:21 compute-0 podman[203151]: 2025-10-07 21:42:21.42962951 +0000 UTC m=+0.076187525 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 07 21:42:21 compute-0 systemd[1]: 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb-3b10c9d8ed1993bc.service: Main process exited, code=exited, status=1/FAILURE
Oct 07 21:42:21 compute-0 systemd[1]: 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb-3b10c9d8ed1993bc.service: Failed with result 'exit-code'.
Oct 07 21:42:21 compute-0 podman[203153]: @ - - [07/Oct/2025:21:42:21 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 16537 "" "Go-http-client/1.1"
Oct 07 21:42:21 compute-0 podman_exporter[203142]: ts=2025-10-07T21:42:21.445Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Oct 07 21:42:21 compute-0 podman_exporter[203142]: ts=2025-10-07T21:42:21.445Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Oct 07 21:42:21 compute-0 podman_exporter[203142]: ts=2025-10-07T21:42:21.446Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Oct 07 21:42:22 compute-0 sudo[203339]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trwyuywpaldqomdqhcenmgbmxhmfrcjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873341.621909-1098-44095495866933/AnsiballZ_systemd.py'
Oct 07 21:42:22 compute-0 sudo[203339]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:42:22 compute-0 python3.9[203341]: ansible-ansible.builtin.systemd Invoked with name=edpm_podman_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 07 21:42:22 compute-0 systemd[1]: Stopping podman_exporter container...
Oct 07 21:42:22 compute-0 podman[203153]: @ - - [07/Oct/2025:21:42:21 +0000] "GET /v4.9.3/libpod/events?filters=%7B%7D&since=&stream=true&until= HTTP/1.1" 200 1449 "" "Go-http-client/1.1"
Oct 07 21:42:22 compute-0 systemd[1]: libpod-9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb.scope: Deactivated successfully.
Oct 07 21:42:22 compute-0 podman[203345]: 2025-10-07 21:42:22.476047263 +0000 UTC m=+0.074514166 container died 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 07 21:42:22 compute-0 systemd[1]: 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb-3b10c9d8ed1993bc.timer: Deactivated successfully.
Oct 07 21:42:22 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb.
Oct 07 21:42:22 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb-userdata-shm.mount: Deactivated successfully.
Oct 07 21:42:22 compute-0 systemd[1]: var-lib-containers-storage-overlay-7153d272ff3d9e7321d8366588135f18c761fac84148707746d2df4765f8c277-merged.mount: Deactivated successfully.
Oct 07 21:42:22 compute-0 podman[203345]: 2025-10-07 21:42:22.726083038 +0000 UTC m=+0.324549951 container cleanup 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 07 21:42:22 compute-0 podman[203345]: podman_exporter
Oct 07 21:42:22 compute-0 systemd[1]: edpm_podman_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Oct 07 21:42:22 compute-0 podman[203375]: podman_exporter
Oct 07 21:42:22 compute-0 systemd[1]: edpm_podman_exporter.service: Failed with result 'exit-code'.
Oct 07 21:42:22 compute-0 systemd[1]: Stopped podman_exporter container.
Oct 07 21:42:22 compute-0 systemd[1]: Starting podman_exporter container...
Oct 07 21:42:22 compute-0 systemd[1]: Started libcrun container.
Oct 07 21:42:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7153d272ff3d9e7321d8366588135f18c761fac84148707746d2df4765f8c277/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct 07 21:42:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7153d272ff3d9e7321d8366588135f18c761fac84148707746d2df4765f8c277/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct 07 21:42:22 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb.
Oct 07 21:42:22 compute-0 podman[203388]: 2025-10-07 21:42:22.983911603 +0000 UTC m=+0.153678668 container init 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 07 21:42:23 compute-0 podman_exporter[203405]: ts=2025-10-07T21:42:23.005Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Oct 07 21:42:23 compute-0 podman_exporter[203405]: ts=2025-10-07T21:42:23.005Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Oct 07 21:42:23 compute-0 podman_exporter[203405]: ts=2025-10-07T21:42:23.005Z caller=handler.go:94 level=info msg="enabled collectors"
Oct 07 21:42:23 compute-0 podman_exporter[203405]: ts=2025-10-07T21:42:23.005Z caller=handler.go:105 level=info collector=container
Oct 07 21:42:23 compute-0 podman[203153]: @ - - [07/Oct/2025:21:42:23 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Oct 07 21:42:23 compute-0 podman[203153]: time="2025-10-07T21:42:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 21:42:23 compute-0 podman[203388]: 2025-10-07 21:42:23.013648809 +0000 UTC m=+0.183415814 container start 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 07 21:42:23 compute-0 podman[203388]: podman_exporter
Oct 07 21:42:23 compute-0 systemd[1]: Started podman_exporter container.
Oct 07 21:42:23 compute-0 podman[203153]: @ - - [07/Oct/2025:21:42:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 16539 "" "Go-http-client/1.1"
Oct 07 21:42:23 compute-0 podman_exporter[203405]: ts=2025-10-07T21:42:23.031Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Oct 07 21:42:23 compute-0 podman_exporter[203405]: ts=2025-10-07T21:42:23.032Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Oct 07 21:42:23 compute-0 podman_exporter[203405]: ts=2025-10-07T21:42:23.033Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Oct 07 21:42:23 compute-0 sudo[203339]: pam_unix(sudo:session): session closed for user root
Oct 07 21:42:23 compute-0 podman[203415]: 2025-10-07 21:42:23.125220765 +0000 UTC m=+0.096990658 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 07 21:42:23 compute-0 sudo[203588]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-edumwdmjthdtoqhokydjdilwegafydwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873343.2818878-1114-191990370842926/AnsiballZ_stat.py'
Oct 07 21:42:23 compute-0 sudo[203588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:42:23 compute-0 python3.9[203590]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:42:23 compute-0 sudo[203588]: pam_unix(sudo:session): session closed for user root
Oct 07 21:42:24 compute-0 sudo[203711]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-urldbfwkrvmbyeabiilqccurnkudmpln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873343.2818878-1114-191990370842926/AnsiballZ_copy.py'
Oct 07 21:42:24 compute-0 sudo[203711]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:42:24 compute-0 python3.9[203713]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759873343.2818878-1114-191990370842926/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 07 21:42:24 compute-0 sudo[203711]: pam_unix(sudo:session): session closed for user root
Oct 07 21:42:25 compute-0 sudo[203863]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irobwmbdxxykbdqmyumtvdbxmnyjcjwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873344.9543438-1148-130390118754218/AnsiballZ_container_config_data.py'
Oct 07 21:42:25 compute-0 sudo[203863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:42:25 compute-0 python3.9[203865]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=openstack_network_exporter.json debug=False
Oct 07 21:42:25 compute-0 sudo[203863]: pam_unix(sudo:session): session closed for user root
Oct 07 21:42:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:42:25.584 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:42:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:42:25.584 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:42:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:42:25.584 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:42:26 compute-0 sudo[204016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjazudxcavjqpwsdyumukpwpoqwouhnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873345.8436394-1166-177068113581821/AnsiballZ_container_config_hash.py'
Oct 07 21:42:26 compute-0 sudo[204016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:42:26 compute-0 python3.9[204018]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 07 21:42:26 compute-0 sudo[204016]: pam_unix(sudo:session): session closed for user root
Oct 07 21:42:27 compute-0 sudo[204168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbrlbbnxqszxjzlojmlmhwqrzgakpxng ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759873346.8114874-1186-173095031269817/AnsiballZ_edpm_container_manage.py'
Oct 07 21:42:27 compute-0 sudo[204168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:42:27 compute-0 python3[204170]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=openstack_network_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Oct 07 21:42:29 compute-0 podman[204242]: 2025-10-07 21:42:29.931776355 +0000 UTC m=+0.202022713 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007)
Oct 07 21:42:30 compute-0 podman[204183]: 2025-10-07 21:42:30.083582085 +0000 UTC m=+2.557949123 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Oct 07 21:42:30 compute-0 podman[204304]: 2025-10-07 21:42:30.307163718 +0000 UTC m=+0.078627085 container create c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, distribution-scope=public, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, config_id=edpm, io.openshift.expose-services=, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., architecture=x86_64, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-type=git, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct 07 21:42:30 compute-0 podman[204304]: 2025-10-07 21:42:30.269598435 +0000 UTC m=+0.041061852 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Oct 07 21:42:30 compute-0 python3[204170]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OS_ENDPOINT_TYPE=internal --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=edpm --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z 
quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Oct 07 21:42:30 compute-0 sudo[204168]: pam_unix(sudo:session): session closed for user root
Oct 07 21:42:31 compute-0 sudo[204492]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khxymkjhkunhyyndsdjnzeqltmaigcgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873350.8059864-1202-270577999613410/AnsiballZ_stat.py'
Oct 07 21:42:31 compute-0 sudo[204492]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:42:31 compute-0 python3.9[204494]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 07 21:42:31 compute-0 sudo[204492]: pam_unix(sudo:session): session closed for user root
Oct 07 21:42:32 compute-0 sudo[204646]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-queletyiqnwewhwcdibprqowbelxelmz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873351.821415-1220-261022092348802/AnsiballZ_file.py'
Oct 07 21:42:32 compute-0 sudo[204646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:42:32 compute-0 python3.9[204648]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:42:32 compute-0 sudo[204646]: pam_unix(sudo:session): session closed for user root
Oct 07 21:42:33 compute-0 sudo[204797]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ochswahkdpyifdyqfqppcdnolsexrorf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873352.5000873-1220-183309683768905/AnsiballZ_copy.py'
Oct 07 21:42:33 compute-0 sudo[204797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:42:33 compute-0 python3.9[204799]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759873352.5000873-1220-183309683768905/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:42:33 compute-0 sudo[204797]: pam_unix(sudo:session): session closed for user root
Oct 07 21:42:33 compute-0 sudo[204873]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iluqynpvlbgscnyyfhgenuvaqutfmrrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873352.5000873-1220-183309683768905/AnsiballZ_systemd.py'
Oct 07 21:42:33 compute-0 sudo[204873]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:42:33 compute-0 python3.9[204875]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 07 21:42:33 compute-0 systemd[1]: Reloading.
Oct 07 21:42:34 compute-0 systemd-rc-local-generator[204902]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:42:34 compute-0 systemd-sysv-generator[204905]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:42:34 compute-0 sudo[204873]: pam_unix(sudo:session): session closed for user root
Oct 07 21:42:34 compute-0 sudo[205000]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-witpuwoxsbbidjgbyymrfukftbmzwexk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873352.5000873-1220-183309683768905/AnsiballZ_systemd.py'
Oct 07 21:42:34 compute-0 podman[204959]: 2025-10-07 21:42:34.600073677 +0000 UTC m=+0.071353621 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Oct 07 21:42:34 compute-0 sudo[205000]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:42:34 compute-0 python3.9[205004]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 07 21:42:34 compute-0 systemd[1]: Reloading.
Oct 07 21:42:35 compute-0 systemd-rc-local-generator[205036]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 07 21:42:35 compute-0 systemd-sysv-generator[205039]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 07 21:42:35 compute-0 systemd[1]: Starting openstack_network_exporter container...
Oct 07 21:42:35 compute-0 systemd[1]: Started libcrun container.
Oct 07 21:42:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/646fd519efafca1a2ac1b849d4072d5343cfb5a0edaf09a2dd6d60239de58715/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Oct 07 21:42:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/646fd519efafca1a2ac1b849d4072d5343cfb5a0edaf09a2dd6d60239de58715/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct 07 21:42:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/646fd519efafca1a2ac1b849d4072d5343cfb5a0edaf09a2dd6d60239de58715/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct 07 21:42:35 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7.
Oct 07 21:42:35 compute-0 podman[205046]: 2025-10-07 21:42:35.497689262 +0000 UTC m=+0.137323167 container init c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=openstack_network_exporter, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., release=1755695350, io.buildah.version=1.33.7, distribution-scope=public, architecture=x86_64, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct 07 21:42:35 compute-0 openstack_network_exporter[205062]: INFO    21:42:35 main.go:48: registering *bridge.Collector
Oct 07 21:42:35 compute-0 openstack_network_exporter[205062]: INFO    21:42:35 main.go:48: registering *coverage.Collector
Oct 07 21:42:35 compute-0 openstack_network_exporter[205062]: INFO    21:42:35 main.go:48: registering *datapath.Collector
Oct 07 21:42:35 compute-0 openstack_network_exporter[205062]: INFO    21:42:35 main.go:48: registering *iface.Collector
Oct 07 21:42:35 compute-0 openstack_network_exporter[205062]: INFO    21:42:35 main.go:48: registering *memory.Collector
Oct 07 21:42:35 compute-0 openstack_network_exporter[205062]: INFO    21:42:35 main.go:48: registering *ovnnorthd.Collector
Oct 07 21:42:35 compute-0 openstack_network_exporter[205062]: INFO    21:42:35 main.go:48: registering *ovn.Collector
Oct 07 21:42:35 compute-0 openstack_network_exporter[205062]: INFO    21:42:35 main.go:48: registering *ovsdbserver.Collector
Oct 07 21:42:35 compute-0 openstack_network_exporter[205062]: INFO    21:42:35 main.go:48: registering *pmd_perf.Collector
Oct 07 21:42:35 compute-0 openstack_network_exporter[205062]: INFO    21:42:35 main.go:48: registering *pmd_rxq.Collector
Oct 07 21:42:35 compute-0 openstack_network_exporter[205062]: INFO    21:42:35 main.go:48: registering *vswitch.Collector
Oct 07 21:42:35 compute-0 openstack_network_exporter[205062]: NOTICE  21:42:35 main.go:76: listening on https://:9105/metrics
Oct 07 21:42:35 compute-0 podman[205046]: 2025-10-07 21:42:35.523123672 +0000 UTC m=+0.162757617 container start c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, architecture=x86_64, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc.)
Oct 07 21:42:35 compute-0 podman[205046]: openstack_network_exporter
Oct 07 21:42:35 compute-0 systemd[1]: Started openstack_network_exporter container.
Oct 07 21:42:35 compute-0 sudo[205000]: pam_unix(sudo:session): session closed for user root
Oct 07 21:42:35 compute-0 podman[205072]: 2025-10-07 21:42:35.652532795 +0000 UTC m=+0.107930624 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_id=edpm, maintainer=Red Hat, Inc., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, managed_by=edpm_ansible, vendor=Red Hat, Inc., version=9.6, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter)
Oct 07 21:42:36 compute-0 sudo[205241]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-opfendbzfzyfngpwogkbdhdpdfzfjqvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873355.7493992-1268-26888851522893/AnsiballZ_systemd.py'
Oct 07 21:42:36 compute-0 sudo[205241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:42:36 compute-0 python3.9[205243]: ansible-ansible.builtin.systemd Invoked with name=edpm_openstack_network_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 07 21:42:36 compute-0 systemd[1]: Stopping openstack_network_exporter container...
Oct 07 21:42:36 compute-0 systemd[1]: libpod-c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7.scope: Deactivated successfully.
Oct 07 21:42:36 compute-0 podman[205247]: 2025-10-07 21:42:36.632207248 +0000 UTC m=+0.066751020 container died c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, version=9.6, io.buildah.version=1.33.7, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, distribution-scope=public, config_id=edpm, vcs-type=git, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, vendor=Red Hat, Inc.)
Oct 07 21:42:36 compute-0 systemd[1]: c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7-33fe4ddf15106f62.timer: Deactivated successfully.
Oct 07 21:42:36 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7.
Oct 07 21:42:36 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7-userdata-shm.mount: Deactivated successfully.
Oct 07 21:42:36 compute-0 systemd[1]: var-lib-containers-storage-overlay-646fd519efafca1a2ac1b849d4072d5343cfb5a0edaf09a2dd6d60239de58715-merged.mount: Deactivated successfully.
Oct 07 21:42:37 compute-0 podman[205247]: 2025-10-07 21:42:37.410091317 +0000 UTC m=+0.844635089 container cleanup c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, container_name=openstack_network_exporter, vendor=Red Hat, Inc., version=9.6, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, maintainer=Red Hat, Inc.)
Oct 07 21:42:37 compute-0 podman[205247]: openstack_network_exporter
Oct 07 21:42:37 compute-0 systemd[1]: edpm_openstack_network_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Oct 07 21:42:37 compute-0 podman[205276]: openstack_network_exporter
Oct 07 21:42:37 compute-0 systemd[1]: edpm_openstack_network_exporter.service: Failed with result 'exit-code'.
Oct 07 21:42:37 compute-0 systemd[1]: Stopped openstack_network_exporter container.
Oct 07 21:42:37 compute-0 systemd[1]: Starting openstack_network_exporter container...
Oct 07 21:42:37 compute-0 systemd[1]: Started libcrun container.
Oct 07 21:42:37 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/646fd519efafca1a2ac1b849d4072d5343cfb5a0edaf09a2dd6d60239de58715/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Oct 07 21:42:37 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/646fd519efafca1a2ac1b849d4072d5343cfb5a0edaf09a2dd6d60239de58715/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct 07 21:42:37 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/646fd519efafca1a2ac1b849d4072d5343cfb5a0edaf09a2dd6d60239de58715/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct 07 21:42:37 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7.
Oct 07 21:42:37 compute-0 podman[205289]: 2025-10-07 21:42:37.654350365 +0000 UTC m=+0.126531125 container init c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal)
Oct 07 21:42:37 compute-0 openstack_network_exporter[205305]: INFO    21:42:37 main.go:48: registering *bridge.Collector
Oct 07 21:42:37 compute-0 openstack_network_exporter[205305]: INFO    21:42:37 main.go:48: registering *coverage.Collector
Oct 07 21:42:37 compute-0 openstack_network_exporter[205305]: INFO    21:42:37 main.go:48: registering *datapath.Collector
Oct 07 21:42:37 compute-0 openstack_network_exporter[205305]: INFO    21:42:37 main.go:48: registering *iface.Collector
Oct 07 21:42:37 compute-0 openstack_network_exporter[205305]: INFO    21:42:37 main.go:48: registering *memory.Collector
Oct 07 21:42:37 compute-0 openstack_network_exporter[205305]: INFO    21:42:37 main.go:48: registering *ovnnorthd.Collector
Oct 07 21:42:37 compute-0 openstack_network_exporter[205305]: INFO    21:42:37 main.go:48: registering *ovn.Collector
Oct 07 21:42:37 compute-0 openstack_network_exporter[205305]: INFO    21:42:37 main.go:48: registering *ovsdbserver.Collector
Oct 07 21:42:37 compute-0 openstack_network_exporter[205305]: INFO    21:42:37 main.go:48: registering *pmd_perf.Collector
Oct 07 21:42:37 compute-0 openstack_network_exporter[205305]: INFO    21:42:37 main.go:48: registering *pmd_rxq.Collector
Oct 07 21:42:37 compute-0 openstack_network_exporter[205305]: INFO    21:42:37 main.go:48: registering *vswitch.Collector
Oct 07 21:42:37 compute-0 openstack_network_exporter[205305]: NOTICE  21:42:37 main.go:76: listening on https://:9105/metrics
Oct 07 21:42:37 compute-0 podman[205289]: 2025-10-07 21:42:37.701963907 +0000 UTC m=+0.174144657 container start c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, managed_by=edpm_ansible, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, distribution-scope=public, release=1755695350, config_id=edpm, container_name=openstack_network_exporter, name=ubi9-minimal, architecture=x86_64, io.openshift.expose-services=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc.)
Oct 07 21:42:37 compute-0 podman[205289]: openstack_network_exporter
Oct 07 21:42:37 compute-0 systemd[1]: Started openstack_network_exporter container.
Oct 07 21:42:37 compute-0 sudo[205241]: pam_unix(sudo:session): session closed for user root
Oct 07 21:42:37 compute-0 podman[205315]: 2025-10-07 21:42:37.811813828 +0000 UTC m=+0.093068158 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, distribution-scope=public, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, version=9.6, name=ubi9-minimal, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct 07 21:42:38 compute-0 sudo[205485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wauxqzykgovlluebyqnhsnzjzpimqddg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873357.9652848-1284-35115560425894/AnsiballZ_find.py'
Oct 07 21:42:38 compute-0 sudo[205485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:42:38 compute-0 python3.9[205487]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct 07 21:42:38 compute-0 sudo[205485]: pam_unix(sudo:session): session closed for user root
Oct 07 21:42:39 compute-0 sudo[205637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lskmmshwitnzjwtvkrrnmtvleyrbtzbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873359.0490785-1303-70633515986861/AnsiballZ_podman_container_info.py'
Oct 07 21:42:39 compute-0 sudo[205637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:42:39 compute-0 python3.9[205639]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Oct 07 21:42:40 compute-0 sudo[205637]: pam_unix(sudo:session): session closed for user root
Oct 07 21:42:40 compute-0 sudo[205802]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nuhjaswzikqxkeelhfpimojecuhwvyzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873360.331074-1311-12974858536760/AnsiballZ_podman_container_exec.py'
Oct 07 21:42:40 compute-0 sudo[205802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:42:41 compute-0 python3.9[205804]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 07 21:42:41 compute-0 systemd[1]: Started libpod-conmon-0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec.scope.
Oct 07 21:42:41 compute-0 podman[205805]: 2025-10-07 21:42:41.294725864 +0000 UTC m=+0.108143550 container exec 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_managed=true, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0)
Oct 07 21:42:41 compute-0 podman[205805]: 2025-10-07 21:42:41.332493813 +0000 UTC m=+0.145911529 container exec_died 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, tcib_managed=true, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4)
Oct 07 21:42:41 compute-0 systemd[1]: libpod-conmon-0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec.scope: Deactivated successfully.
Oct 07 21:42:41 compute-0 sudo[205802]: pam_unix(sudo:session): session closed for user root
Oct 07 21:42:41 compute-0 sudo[205986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akhwhrzbgiwhbkjoxsxhhhihtdpcnort ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873361.629566-1319-138006828268573/AnsiballZ_podman_container_exec.py'
Oct 07 21:42:41 compute-0 sudo[205986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:42:42 compute-0 python3.9[205988]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 07 21:42:42 compute-0 systemd[1]: Started libpod-conmon-0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec.scope.
Oct 07 21:42:42 compute-0 podman[205989]: 2025-10-07 21:42:42.313156077 +0000 UTC m=+0.082081531 container exec 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 07 21:42:42 compute-0 podman[205989]: 2025-10-07 21:42:42.347415398 +0000 UTC m=+0.116340822 container exec_died 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251007, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller)
Oct 07 21:42:42 compute-0 systemd[1]: libpod-conmon-0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec.scope: Deactivated successfully.
Oct 07 21:42:42 compute-0 sudo[205986]: pam_unix(sudo:session): session closed for user root
Oct 07 21:42:42 compute-0 sudo[206168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tycgzuatucrbtqtapxokpjnzoadlgnnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873362.5748436-1327-266179335441061/AnsiballZ_file.py'
Oct 07 21:42:42 compute-0 sudo[206168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:42:43 compute-0 python3.9[206170]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:42:43 compute-0 sudo[206168]: pam_unix(sudo:session): session closed for user root
Oct 07 21:42:43 compute-0 sudo[206320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdurcwitsjpjbkdvbtmbqybaszktkuig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873363.2824135-1336-250644498994178/AnsiballZ_podman_container_info.py'
Oct 07 21:42:43 compute-0 sudo[206320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:42:43 compute-0 python3.9[206322]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Oct 07 21:42:43 compute-0 sudo[206320]: pam_unix(sudo:session): session closed for user root
Oct 07 21:42:44 compute-0 sudo[206486]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwvczylivjzrbqqfotupfqzlsqrgqwzs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873364.0326047-1344-266991588828173/AnsiballZ_podman_container_exec.py'
Oct 07 21:42:44 compute-0 sudo[206486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:42:44 compute-0 python3.9[206488]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 07 21:42:44 compute-0 systemd[1]: Started libpod-conmon-c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675.scope.
Oct 07 21:42:44 compute-0 podman[206489]: 2025-10-07 21:42:44.641402328 +0000 UTC m=+0.074091536 container exec c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 07 21:42:44 compute-0 podman[206489]: 2025-10-07 21:42:44.676428603 +0000 UTC m=+0.109117751 container exec_died c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Oct 07 21:42:44 compute-0 systemd[1]: libpod-conmon-c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675.scope: Deactivated successfully.
Oct 07 21:42:44 compute-0 sudo[206486]: pam_unix(sudo:session): session closed for user root
Oct 07 21:42:45 compute-0 sudo[206669]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydwxgdwijmquagcmljfplsqnspsjhqes ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873364.921434-1352-280237815940647/AnsiballZ_podman_container_exec.py'
Oct 07 21:42:45 compute-0 sudo[206669]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:42:45 compute-0 python3.9[206671]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 07 21:42:45 compute-0 systemd[1]: Started libpod-conmon-c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675.scope.
Oct 07 21:42:45 compute-0 podman[206672]: 2025-10-07 21:42:45.592999818 +0000 UTC m=+0.094461571 container exec c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 07 21:42:45 compute-0 podman[206672]: 2025-10-07 21:42:45.602369966 +0000 UTC m=+0.103831719 container exec_died c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Oct 07 21:42:45 compute-0 sudo[206669]: pam_unix(sudo:session): session closed for user root
Oct 07 21:42:45 compute-0 systemd[1]: libpod-conmon-c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675.scope: Deactivated successfully.
Oct 07 21:42:46 compute-0 sudo[206852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtehxjkgfamxmwtgoyronhzlykxyqieh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873365.8099172-1360-9428956025833/AnsiballZ_file.py'
Oct 07 21:42:46 compute-0 sudo[206852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:42:46 compute-0 python3.9[206854]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:42:46 compute-0 sudo[206852]: pam_unix(sudo:session): session closed for user root
Oct 07 21:42:47 compute-0 sudo[207004]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfvzvlrsrgcjlsuyvhrjuesfdknsdzdg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873366.6757514-1369-243154284562986/AnsiballZ_podman_container_info.py'
Oct 07 21:42:47 compute-0 sudo[207004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:42:47 compute-0 python3.9[207006]: ansible-containers.podman.podman_container_info Invoked with name=['iscsid'] executable=podman
Oct 07 21:42:47 compute-0 sudo[207004]: pam_unix(sudo:session): session closed for user root
Oct 07 21:42:47 compute-0 sudo[207170]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orzpsonuvwnwbermprepaloetqvhbpqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873367.5546598-1377-177382104761629/AnsiballZ_podman_container_exec.py'
Oct 07 21:42:47 compute-0 sudo[207170]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:42:48 compute-0 python3.9[207172]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=iscsid detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 07 21:42:48 compute-0 systemd[1]: Started libpod-conmon-bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71.scope.
Oct 07 21:42:48 compute-0 podman[207173]: 2025-10-07 21:42:48.261561276 +0000 UTC m=+0.106864692 container exec bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid)
Oct 07 21:42:48 compute-0 podman[207173]: 2025-10-07 21:42:48.296441916 +0000 UTC m=+0.141745342 container exec_died bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, config_id=iscsid, tcib_managed=true)
Oct 07 21:42:48 compute-0 systemd[1]: libpod-conmon-bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71.scope: Deactivated successfully.
Oct 07 21:42:48 compute-0 sudo[207170]: pam_unix(sudo:session): session closed for user root
Oct 07 21:42:48 compute-0 sudo[207353]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kydblvguyymitoxsdebarkxrkhqvwqnw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873368.558016-1385-152694000778484/AnsiballZ_podman_container_exec.py'
Oct 07 21:42:48 compute-0 sudo[207353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:42:49 compute-0 python3.9[207355]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=iscsid detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 07 21:42:49 compute-0 systemd[1]: Started libpod-conmon-bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71.scope.
Oct 07 21:42:49 compute-0 podman[207356]: 2025-10-07 21:42:49.37028884 +0000 UTC m=+0.098696531 container exec bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=iscsid, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2)
Oct 07 21:42:49 compute-0 podman[207356]: 2025-10-07 21:42:49.400379343 +0000 UTC m=+0.128787014 container exec_died bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, container_name=iscsid, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Oct 07 21:42:49 compute-0 systemd[1]: libpod-conmon-bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71.scope: Deactivated successfully.
Oct 07 21:42:49 compute-0 sudo[207353]: pam_unix(sudo:session): session closed for user root
Oct 07 21:42:49 compute-0 podman[207385]: 2025-10-07 21:42:49.521214122 +0000 UTC m=+0.057245968 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 07 21:42:49 compute-0 podman[207386]: 2025-10-07 21:42:49.546988634 +0000 UTC m=+0.072819237 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251007, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Oct 07 21:42:49 compute-0 sudo[207572]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwxtxfkthdafbeifyribbomuhcdcnzif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873369.5968714-1393-124299917315809/AnsiballZ_file.py'
Oct 07 21:42:49 compute-0 sudo[207572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:42:50 compute-0 python3.9[207574]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/iscsid recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:42:50 compute-0 sudo[207572]: pam_unix(sudo:session): session closed for user root
Oct 07 21:42:50 compute-0 unix_chkpwd[207651]: password check failed for user (sshd)
Oct 07 21:42:50 compute-0 sshd-session[207299]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=116.110.151.5  user=sshd
Oct 07 21:42:50 compute-0 sudo[207725]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkweavgevsarebnqvhyxfzbvydqxvytq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873370.4001117-1402-273957369876854/AnsiballZ_podman_container_info.py'
Oct 07 21:42:50 compute-0 sudo[207725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:42:50 compute-0 python3.9[207727]: ansible-containers.podman.podman_container_info Invoked with name=['multipathd'] executable=podman
Oct 07 21:42:51 compute-0 sudo[207725]: pam_unix(sudo:session): session closed for user root
Oct 07 21:42:51 compute-0 sudo[207890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwybhimoejpibemzytalotslttrjamgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873371.3713007-1410-247263261076429/AnsiballZ_podman_container_exec.py'
Oct 07 21:42:51 compute-0 sudo[207890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:42:51 compute-0 python3.9[207892]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 07 21:42:52 compute-0 systemd[1]: Started libpod-conmon-c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2.scope.
Oct 07 21:42:52 compute-0 podman[207893]: 2025-10-07 21:42:52.102770198 +0000 UTC m=+0.091255811 container exec c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251007, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct 07 21:42:52 compute-0 podman[207893]: 2025-10-07 21:42:52.138420693 +0000 UTC m=+0.126906336 container exec_died c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 07 21:42:52 compute-0 sudo[207890]: pam_unix(sudo:session): session closed for user root
Oct 07 21:42:52 compute-0 systemd[1]: libpod-conmon-c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2.scope: Deactivated successfully.
Oct 07 21:42:52 compute-0 sshd-session[207299]: Failed password for sshd from 116.110.151.5 port 40682 ssh2
Oct 07 21:42:52 compute-0 sudo[208075]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpcrwczpcpaxdrcjgnjmavsjmkaxszev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873372.4008207-1418-20601373988317/AnsiballZ_podman_container_exec.py'
Oct 07 21:42:52 compute-0 sudo[208075]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:42:52 compute-0 python3.9[208077]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 07 21:42:53 compute-0 systemd[1]: Started libpod-conmon-c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2.scope.
Oct 07 21:42:53 compute-0 podman[208078]: 2025-10-07 21:42:53.100185136 +0000 UTC m=+0.104768357 container exec c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=multipathd)
Oct 07 21:42:53 compute-0 podman[208078]: 2025-10-07 21:42:53.135375167 +0000 UTC m=+0.139958348 container exec_died c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, config_id=multipathd, container_name=multipathd)
Oct 07 21:42:53 compute-0 systemd[1]: libpod-conmon-c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2.scope: Deactivated successfully.
Oct 07 21:42:53 compute-0 sudo[208075]: pam_unix(sudo:session): session closed for user root
Oct 07 21:42:53 compute-0 podman[208108]: 2025-10-07 21:42:53.317065904 +0000 UTC m=+0.091700106 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 07 21:42:53 compute-0 sudo[208281]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctmoxflalkmuxgkssrgjskgrgefcqpbi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873373.4269574-1426-268402866401667/AnsiballZ_file.py'
Oct 07 21:42:53 compute-0 sudo[208281]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:42:54 compute-0 python3.9[208283]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/multipathd recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:42:54 compute-0 sudo[208281]: pam_unix(sudo:session): session closed for user root
Oct 07 21:42:54 compute-0 sudo[208433]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvpslaslduynlirsnnrmdnrvkrsxjmuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873374.480477-1435-42167090323802/AnsiballZ_podman_container_info.py'
Oct 07 21:42:54 compute-0 sudo[208433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:42:55 compute-0 python3.9[208435]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Oct 07 21:42:55 compute-0 sudo[208433]: pam_unix(sudo:session): session closed for user root
Oct 07 21:42:55 compute-0 sudo[208599]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aibzwpnwvuegsawtxfueeagwktswglat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873375.4199142-1443-187222796376472/AnsiballZ_podman_container_exec.py'
Oct 07 21:42:55 compute-0 sudo[208599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:42:56 compute-0 python3.9[208601]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 07 21:42:56 compute-0 systemd[1]: Started libpod-conmon-9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb.scope.
Oct 07 21:42:56 compute-0 podman[208602]: 2025-10-07 21:42:56.139785923 +0000 UTC m=+0.097766342 container exec 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 07 21:42:56 compute-0 podman[208602]: 2025-10-07 21:42:56.171234318 +0000 UTC m=+0.129214767 container exec_died 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 07 21:42:56 compute-0 sshd-session[207299]: Connection closed by authenticating user sshd 116.110.151.5 port 40682 [preauth]
Oct 07 21:42:56 compute-0 systemd[1]: libpod-conmon-9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb.scope: Deactivated successfully.
Oct 07 21:42:56 compute-0 sudo[208599]: pam_unix(sudo:session): session closed for user root
Oct 07 21:42:56 compute-0 sudo[208782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxllphtaqyyhoegrbzihtvvztnfecqjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873376.413063-1451-218497748060603/AnsiballZ_podman_container_exec.py'
Oct 07 21:42:56 compute-0 sudo[208782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:42:57 compute-0 python3.9[208784]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 07 21:42:57 compute-0 systemd[1]: Started libpod-conmon-9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb.scope.
Oct 07 21:42:57 compute-0 podman[208785]: 2025-10-07 21:42:57.155056349 +0000 UTC m=+0.099237437 container exec 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 07 21:42:57 compute-0 podman[208785]: 2025-10-07 21:42:57.190492317 +0000 UTC m=+0.134673345 container exec_died 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 07 21:42:57 compute-0 systemd[1]: libpod-conmon-9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb.scope: Deactivated successfully.
Oct 07 21:42:57 compute-0 sudo[208782]: pam_unix(sudo:session): session closed for user root
Oct 07 21:42:57 compute-0 sudo[208967]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrstsrlsmtpakotuiivmyyujznibdmzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873377.4803066-1459-262548953497411/AnsiballZ_file.py'
Oct 07 21:42:57 compute-0 sudo[208967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:42:58 compute-0 python3.9[208969]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:42:58 compute-0 sudo[208967]: pam_unix(sudo:session): session closed for user root
Oct 07 21:42:58 compute-0 sudo[209119]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihootaveagveonogkzkuhskwlfggdqvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873378.434933-1468-259844213222257/AnsiballZ_podman_container_info.py'
Oct 07 21:42:58 compute-0 sudo[209119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:42:58 compute-0 python3.9[209121]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Oct 07 21:42:59 compute-0 sudo[209119]: pam_unix(sudo:session): session closed for user root
Oct 07 21:42:59 compute-0 sudo[209284]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbthqkjvilqqxobyemioesqgaetvterh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873379.301383-1476-173041329970160/AnsiballZ_podman_container_exec.py'
Oct 07 21:42:59 compute-0 sudo[209284]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:42:59 compute-0 python3.9[209286]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 07 21:42:59 compute-0 systemd[1]: Started libpod-conmon-c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7.scope.
Oct 07 21:42:59 compute-0 podman[209287]: 2025-10-07 21:42:59.980107279 +0000 UTC m=+0.077146169 container exec c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, managed_by=edpm_ansible, release=1755695350, architecture=x86_64, build-date=2025-08-20T13:12:41, config_id=edpm, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct 07 21:42:59 compute-0 podman[209287]: 2025-10-07 21:42:59.992274613 +0000 UTC m=+0.089313493 container exec_died c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.buildah.version=1.33.7, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, version=9.6, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, config_id=edpm, io.openshift.expose-services=, container_name=openstack_network_exporter, maintainer=Red Hat, Inc.)
Oct 07 21:43:00 compute-0 sudo[209284]: pam_unix(sudo:session): session closed for user root
Oct 07 21:43:00 compute-0 systemd[1]: libpod-conmon-c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7.scope: Deactivated successfully.
Oct 07 21:43:00 compute-0 podman[209322]: 2025-10-07 21:43:00.255930496 +0000 UTC m=+0.148815709 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct 07 21:43:00 compute-0 sudo[209496]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fscbdorwabahujtgelzggxsxjitiyuol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873380.225616-1484-127140879605199/AnsiballZ_podman_container_exec.py'
Oct 07 21:43:00 compute-0 sudo[209496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:43:00 compute-0 python3.9[209498]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 07 21:43:00 compute-0 systemd[1]: Started libpod-conmon-c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7.scope.
Oct 07 21:43:01 compute-0 podman[209499]: 2025-10-07 21:43:01.010917852 +0000 UTC m=+0.108331646 container exec c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, container_name=openstack_network_exporter, vcs-type=git, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7)
Oct 07 21:43:01 compute-0 podman[209499]: 2025-10-07 21:43:01.020250399 +0000 UTC m=+0.117664133 container exec_died c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=edpm, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, managed_by=edpm_ansible, release=1755695350, container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers)
Oct 07 21:43:01 compute-0 sudo[209496]: pam_unix(sudo:session): session closed for user root
Oct 07 21:43:01 compute-0 systemd[1]: libpod-conmon-c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7.scope: Deactivated successfully.
Oct 07 21:43:01 compute-0 sudo[209681]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygczkgammblwnwbopwbsnlbsjxqifnpj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873381.3009124-1492-138682616063148/AnsiballZ_file.py'
Oct 07 21:43:01 compute-0 sudo[209681]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:43:01 compute-0 python3.9[209683]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:43:01 compute-0 sudo[209681]: pam_unix(sudo:session): session closed for user root
Oct 07 21:43:04 compute-0 podman[209708]: 2025-10-07 21:43:04.864496645 +0000 UTC m=+0.090108457 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Oct 07 21:43:08 compute-0 podman[209729]: 2025-10-07 21:43:08.833896084 +0000 UTC m=+0.072567528 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.buildah.version=1.33.7, version=9.6, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.tags=minimal rhel9, release=1755695350, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, vendor=Red Hat, Inc., maintainer=Red Hat, Inc.)
Oct 07 21:43:12 compute-0 nova_compute[192716]: 2025-10-07 21:43:12.880 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:43:12 compute-0 nova_compute[192716]: 2025-10-07 21:43:12.881 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:43:13 compute-0 nova_compute[192716]: 2025-10-07 21:43:13.391 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:43:13 compute-0 nova_compute[192716]: 2025-10-07 21:43:13.392 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:43:13 compute-0 nova_compute[192716]: 2025-10-07 21:43:13.392 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:43:13 compute-0 nova_compute[192716]: 2025-10-07 21:43:13.392 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:43:13 compute-0 nova_compute[192716]: 2025-10-07 21:43:13.392 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:43:13 compute-0 nova_compute[192716]: 2025-10-07 21:43:13.393 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:43:13 compute-0 nova_compute[192716]: 2025-10-07 21:43:13.393 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 07 21:43:13 compute-0 nova_compute[192716]: 2025-10-07 21:43:13.393 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:43:13 compute-0 nova_compute[192716]: 2025-10-07 21:43:13.906 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:43:13 compute-0 nova_compute[192716]: 2025-10-07 21:43:13.907 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:43:13 compute-0 nova_compute[192716]: 2025-10-07 21:43:13.907 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:43:13 compute-0 nova_compute[192716]: 2025-10-07 21:43:13.907 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 07 21:43:14 compute-0 nova_compute[192716]: 2025-10-07 21:43:14.087 2 WARNING nova.virt.libvirt.driver [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 21:43:14 compute-0 nova_compute[192716]: 2025-10-07 21:43:14.088 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:43:14 compute-0 nova_compute[192716]: 2025-10-07 21:43:14.127 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:43:14 compute-0 nova_compute[192716]: 2025-10-07 21:43:14.128 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6089MB free_disk=73.34268951416016GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 07 21:43:14 compute-0 nova_compute[192716]: 2025-10-07 21:43:14.128 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:43:14 compute-0 nova_compute[192716]: 2025-10-07 21:43:14.129 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:43:15 compute-0 nova_compute[192716]: 2025-10-07 21:43:15.228 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 07 21:43:15 compute-0 nova_compute[192716]: 2025-10-07 21:43:15.228 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:43:14 up 52 min,  0 user,  load average: 0.58, 0.77, 0.63\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 07 21:43:15 compute-0 nova_compute[192716]: 2025-10-07 21:43:15.251 2 DEBUG nova.compute.provider_tree [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 21:43:15 compute-0 nova_compute[192716]: 2025-10-07 21:43:15.759 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 21:43:16 compute-0 nova_compute[192716]: 2025-10-07 21:43:16.271 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 07 21:43:16 compute-0 nova_compute[192716]: 2025-10-07 21:43:16.272 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.144s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:43:19 compute-0 podman[209752]: 2025-10-07 21:43:19.850600684 +0000 UTC m=+0.084035671 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251007, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true)
Oct 07 21:43:19 compute-0 podman[209753]: 2025-10-07 21:43:19.869626048 +0000 UTC m=+0.097682209 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 07 21:43:23 compute-0 podman[209790]: 2025-10-07 21:43:23.841987388 +0000 UTC m=+0.079874353 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 07 21:43:24 compute-0 sudo[209939]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfveaglsqgvnazljcdyimigunqoslznx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873404.5047617-1700-93696786003231/AnsiballZ_file.py'
Oct 07 21:43:24 compute-0 sudo[209939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:43:25 compute-0 python3.9[209941]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:43:25 compute-0 sudo[209939]: pam_unix(sudo:session): session closed for user root
Oct 07 21:43:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:43:25.585 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:43:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:43:25.586 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:43:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:43:25.586 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:43:25 compute-0 sudo[210092]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvreskiojmqejbxhjcvsnovossrnaict ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873405.2489922-1716-77171968016777/AnsiballZ_stat.py'
Oct 07 21:43:25 compute-0 sudo[210092]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:43:25 compute-0 python3.9[210094]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:43:25 compute-0 sudo[210092]: pam_unix(sudo:session): session closed for user root
Oct 07 21:43:26 compute-0 sudo[210215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqjsnxeywvatnkaqgcqruwzoajcapjfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873405.2489922-1716-77171968016777/AnsiballZ_copy.py'
Oct 07 21:43:26 compute-0 sudo[210215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:43:26 compute-0 python3.9[210217]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1759873405.2489922-1716-77171968016777/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:43:26 compute-0 sudo[210215]: pam_unix(sudo:session): session closed for user root
Oct 07 21:43:27 compute-0 sudo[210367]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lohoqfykwvrkflwhmmdnuzfxohuphlzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873407.1073246-1748-91855138955994/AnsiballZ_file.py'
Oct 07 21:43:27 compute-0 sudo[210367]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:43:27 compute-0 python3.9[210369]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:43:27 compute-0 sudo[210367]: pam_unix(sudo:session): session closed for user root
Oct 07 21:43:28 compute-0 sudo[210519]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kllgfuuijewmwvkrtubuhepcisrcanqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873407.8929362-1764-18402316492604/AnsiballZ_stat.py'
Oct 07 21:43:28 compute-0 sudo[210519]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:43:28 compute-0 python3.9[210521]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:43:28 compute-0 sudo[210519]: pam_unix(sudo:session): session closed for user root
Oct 07 21:43:28 compute-0 sudo[210597]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-haanrbhxzfrzwtejckyqzwjhzelzoufr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873407.8929362-1764-18402316492604/AnsiballZ_file.py'
Oct 07 21:43:28 compute-0 sudo[210597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:43:29 compute-0 python3.9[210599]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:43:29 compute-0 sudo[210597]: pam_unix(sudo:session): session closed for user root
Oct 07 21:43:29 compute-0 sudo[210749]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unclasuoqusoojfjdpirfgqyzicmkiyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873409.3237088-1788-220271412647194/AnsiballZ_stat.py'
Oct 07 21:43:29 compute-0 sudo[210749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:43:29 compute-0 python3.9[210751]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:43:30 compute-0 sudo[210749]: pam_unix(sudo:session): session closed for user root
Oct 07 21:43:30 compute-0 sudo[210827]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-stnyemmpflsrqxwvfeaeaxluqyralqce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873409.3237088-1788-220271412647194/AnsiballZ_file.py'
Oct 07 21:43:30 compute-0 sudo[210827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:43:30 compute-0 python3.9[210829]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.6xv59fkm recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:43:30 compute-0 sudo[210827]: pam_unix(sudo:session): session closed for user root
Oct 07 21:43:30 compute-0 podman[210890]: 2025-10-07 21:43:30.873128676 +0000 UTC m=+0.109592252 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251007, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Oct 07 21:43:31 compute-0 sudo[211006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jllddcqrdtnckbvaevjolclpcewlencx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873410.6973665-1812-250034359109490/AnsiballZ_stat.py'
Oct 07 21:43:31 compute-0 sudo[211006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:43:31 compute-0 python3.9[211008]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:43:31 compute-0 sudo[211006]: pam_unix(sudo:session): session closed for user root
Oct 07 21:43:31 compute-0 sudo[211084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lnkdhmcnurninvyiqnduflrjuyfmfoei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873410.6973665-1812-250034359109490/AnsiballZ_file.py'
Oct 07 21:43:31 compute-0 sudo[211084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:43:31 compute-0 python3.9[211086]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:43:31 compute-0 sudo[211084]: pam_unix(sudo:session): session closed for user root
Oct 07 21:43:32 compute-0 sudo[211236]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imakaushfaefmsacqiqfjyycfgpuzmjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873412.1061513-1838-133282012707677/AnsiballZ_command.py'
Oct 07 21:43:32 compute-0 sudo[211236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:43:32 compute-0 python3.9[211238]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:43:32 compute-0 sudo[211236]: pam_unix(sudo:session): session closed for user root
Oct 07 21:43:33 compute-0 sudo[211389]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqkqbiovqqshlimtipmlrxdoaxgsvdky ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759873413.000344-1854-140012293090813/AnsiballZ_edpm_nftables_from_files.py'
Oct 07 21:43:33 compute-0 sudo[211389]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:43:33 compute-0 python3[211391]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct 07 21:43:33 compute-0 sudo[211389]: pam_unix(sudo:session): session closed for user root
Oct 07 21:43:34 compute-0 sudo[211541]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rilgdxwzlcblxxuzthaajuuqlwrqsxln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873414.0776103-1870-75240175379809/AnsiballZ_stat.py'
Oct 07 21:43:34 compute-0 sudo[211541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:43:34 compute-0 python3.9[211543]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:43:34 compute-0 sudo[211541]: pam_unix(sudo:session): session closed for user root
Oct 07 21:43:35 compute-0 sudo[211630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjrlfysefkvgzvgyobnwdupgwjvwfdie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873414.0776103-1870-75240175379809/AnsiballZ_file.py'
Oct 07 21:43:35 compute-0 sudo[211630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:43:35 compute-0 podman[211593]: 2025-10-07 21:43:35.157272578 +0000 UTC m=+0.119057097 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251007, tcib_build_tag=watcher_latest)
Oct 07 21:43:35 compute-0 python3.9[211636]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:43:35 compute-0 sudo[211630]: pam_unix(sudo:session): session closed for user root
Oct 07 21:43:36 compute-0 sudo[211792]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nyzrpqllotooyloxpqnlpwiriqwqfyyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873415.581407-1894-7869902991400/AnsiballZ_stat.py'
Oct 07 21:43:36 compute-0 sudo[211792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:43:36 compute-0 python3.9[211794]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:43:36 compute-0 sudo[211792]: pam_unix(sudo:session): session closed for user root
Oct 07 21:43:36 compute-0 sudo[211870]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdkdhfmkjgtbcbovnlnwcyhdbnhbejsd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873415.581407-1894-7869902991400/AnsiballZ_file.py'
Oct 07 21:43:36 compute-0 sudo[211870]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:43:36 compute-0 unix_chkpwd[211873]: password check failed for user (root)
Oct 07 21:43:36 compute-0 sshd-session[211641]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=116.110.151.5  user=root
Oct 07 21:43:36 compute-0 python3.9[211872]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:43:36 compute-0 sudo[211870]: pam_unix(sudo:session): session closed for user root
Oct 07 21:43:37 compute-0 sudo[212023]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uyevuhscpswurjhxbioibghiwkwsukmc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873417.100965-1918-55355360399975/AnsiballZ_stat.py'
Oct 07 21:43:37 compute-0 sudo[212023]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:43:37 compute-0 python3.9[212025]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:43:37 compute-0 sudo[212023]: pam_unix(sudo:session): session closed for user root
Oct 07 21:43:38 compute-0 sudo[212101]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wogxlkjhyczoaahrnoqhtbkginklltpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873417.100965-1918-55355360399975/AnsiballZ_file.py'
Oct 07 21:43:38 compute-0 sudo[212101]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:43:38 compute-0 python3.9[212103]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:43:38 compute-0 sudo[212101]: pam_unix(sudo:session): session closed for user root
Oct 07 21:43:38 compute-0 sshd-session[211641]: Failed password for root from 116.110.151.5 port 32888 ssh2
Oct 07 21:43:39 compute-0 sudo[212264]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tftvmfezmyihhikqclmhugvmzvrnygbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873418.5239072-1942-234171602860588/AnsiballZ_stat.py'
Oct 07 21:43:39 compute-0 sudo[212264]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:43:39 compute-0 podman[212227]: 2025-10-07 21:43:39.033913874 +0000 UTC m=+0.095749837 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, release=1755695350, vcs-type=git, build-date=2025-08-20T13:12:41, config_id=edpm, container_name=openstack_network_exporter, io.openshift.expose-services=, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, version=9.6, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc.)
Oct 07 21:43:39 compute-0 python3.9[212272]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:43:39 compute-0 sudo[212264]: pam_unix(sudo:session): session closed for user root
Oct 07 21:43:39 compute-0 sudo[212352]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejgfyrtxgcnsbiurquqijjifwoamxvxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873418.5239072-1942-234171602860588/AnsiballZ_file.py'
Oct 07 21:43:39 compute-0 sudo[212352]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:43:39 compute-0 python3.9[212354]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:43:39 compute-0 sudo[212352]: pam_unix(sudo:session): session closed for user root
Oct 07 21:43:40 compute-0 sudo[212504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijvgvyzzegqxvthexxtfdodclmeyzrhl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873420.0343084-1966-255233359435297/AnsiballZ_stat.py'
Oct 07 21:43:40 compute-0 sudo[212504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:43:40 compute-0 python3.9[212506]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 07 21:43:40 compute-0 sudo[212504]: pam_unix(sudo:session): session closed for user root
Oct 07 21:43:41 compute-0 sshd-session[211641]: Connection closed by authenticating user root 116.110.151.5 port 32888 [preauth]
Oct 07 21:43:41 compute-0 sudo[212629]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzmdrqgcyhckuoshatucdvnnbdojqson ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873420.0343084-1966-255233359435297/AnsiballZ_copy.py'
Oct 07 21:43:41 compute-0 sudo[212629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:43:41 compute-0 python3.9[212631]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759873420.0343084-1966-255233359435297/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:43:41 compute-0 sudo[212629]: pam_unix(sudo:session): session closed for user root
Oct 07 21:43:42 compute-0 sudo[212781]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-buccknfnoxnmvbkraijvklfvnabhksct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873421.763422-1996-216863526888656/AnsiballZ_file.py'
Oct 07 21:43:42 compute-0 sudo[212781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:43:42 compute-0 python3.9[212783]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:43:42 compute-0 sudo[212781]: pam_unix(sudo:session): session closed for user root
Oct 07 21:43:42 compute-0 sudo[212933]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cebzfnvtaywelhmzwimsijcdbjjyhtmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873422.6483111-2012-167101501149197/AnsiballZ_command.py'
Oct 07 21:43:42 compute-0 sudo[212933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:43:43 compute-0 python3.9[212935]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:43:43 compute-0 sudo[212933]: pam_unix(sudo:session): session closed for user root
Oct 07 21:43:44 compute-0 sudo[213088]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fiafgzhbzsupbyjzbzaueurmiktjutqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873423.4974535-2028-47733194815421/AnsiballZ_blockinfile.py'
Oct 07 21:43:44 compute-0 sudo[213088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:43:44 compute-0 python3.9[213090]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:43:44 compute-0 sudo[213088]: pam_unix(sudo:session): session closed for user root
Oct 07 21:43:44 compute-0 sudo[213240]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-npisuykvxtezdxugqaszrjpoyulzclzz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873424.6308398-2046-258730432087984/AnsiballZ_command.py'
Oct 07 21:43:44 compute-0 sudo[213240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:43:45 compute-0 python3.9[213242]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:43:45 compute-0 sudo[213240]: pam_unix(sudo:session): session closed for user root
Oct 07 21:43:45 compute-0 sudo[213393]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmndtukecdqlpaqqlsvddzevqfjjnosa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873425.4552665-2062-241110554278078/AnsiballZ_stat.py'
Oct 07 21:43:45 compute-0 sudo[213393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:43:45 compute-0 python3.9[213395]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 07 21:43:46 compute-0 sudo[213393]: pam_unix(sudo:session): session closed for user root
Oct 07 21:43:46 compute-0 sudo[213547]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvardgscorkasjwhcjfifygginrcjymk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873426.310568-2078-251565034540363/AnsiballZ_command.py'
Oct 07 21:43:46 compute-0 sudo[213547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:43:46 compute-0 python3.9[213549]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 07 21:43:46 compute-0 sudo[213547]: pam_unix(sudo:session): session closed for user root
Oct 07 21:43:47 compute-0 sudo[213702]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lavpltfuvzakreajabcpixpltgprtjov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759873427.202003-2094-83405911805404/AnsiballZ_file.py'
Oct 07 21:43:47 compute-0 sudo[213702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 21:43:47 compute-0 python3.9[213704]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 07 21:43:47 compute-0 sudo[213702]: pam_unix(sudo:session): session closed for user root
Oct 07 21:43:48 compute-0 sshd-session[193039]: Connection closed by 192.168.122.30 port 46074
Oct 07 21:43:48 compute-0 sshd-session[193036]: pam_unix(sshd:session): session closed for user zuul
Oct 07 21:43:48 compute-0 systemd[1]: session-28.scope: Deactivated successfully.
Oct 07 21:43:48 compute-0 systemd[1]: session-28.scope: Consumed 1min 38.801s CPU time.
Oct 07 21:43:48 compute-0 systemd-logind[798]: Session 28 logged out. Waiting for processes to exit.
Oct 07 21:43:48 compute-0 systemd-logind[798]: Removed session 28.
Oct 07 21:43:50 compute-0 podman[213729]: 2025-10-07 21:43:50.855155881 +0000 UTC m=+0.080626978 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=iscsid, container_name=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct 07 21:43:50 compute-0 podman[213730]: 2025-10-07 21:43:50.861436301 +0000 UTC m=+0.091389449 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251007, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.4)
Oct 07 21:43:54 compute-0 sshd-session[213770]: Invalid user github from 103.115.24.11 port 44254
Oct 07 21:43:54 compute-0 sshd-session[213770]: pam_unix(sshd:auth): check pass; user unknown
Oct 07 21:43:54 compute-0 sshd-session[213770]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.115.24.11
Oct 07 21:43:54 compute-0 podman[213772]: 2025-10-07 21:43:54.159477019 +0000 UTC m=+0.054095452 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 07 21:43:55 compute-0 sshd-session[213770]: Failed password for invalid user github from 103.115.24.11 port 44254 ssh2
Oct 07 21:43:56 compute-0 sshd-session[213770]: Received disconnect from 103.115.24.11 port 44254:11: Bye Bye [preauth]
Oct 07 21:43:56 compute-0 sshd-session[213770]: Disconnected from invalid user github 103.115.24.11 port 44254 [preauth]
Oct 07 21:43:59 compute-0 podman[203153]: time="2025-10-07T21:43:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 21:43:59 compute-0 podman[203153]: @ - - [07/Oct/2025:21:43:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 21:43:59 compute-0 podman[203153]: @ - - [07/Oct/2025:21:43:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2973 "" "Go-http-client/1.1"
Oct 07 21:44:01 compute-0 openstack_network_exporter[205305]: ERROR   21:44:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 21:44:01 compute-0 openstack_network_exporter[205305]: ERROR   21:44:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:44:01 compute-0 openstack_network_exporter[205305]: ERROR   21:44:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:44:01 compute-0 openstack_network_exporter[205305]: ERROR   21:44:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 21:44:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 21:44:01 compute-0 openstack_network_exporter[205305]: ERROR   21:44:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 21:44:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 21:44:01 compute-0 podman[213807]: 2025-10-07 21:44:01.895151326 +0000 UTC m=+0.121412840 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=ovn_controller, io.buildah.version=1.41.4, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 07 21:44:05 compute-0 podman[213831]: 2025-10-07 21:44:05.854557158 +0000 UTC m=+0.080064793 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 07 21:44:09 compute-0 sshd-session[213850]: Invalid user oracle from 116.110.151.5 port 54600
Oct 07 21:44:09 compute-0 podman[213852]: 2025-10-07 21:44:09.839222832 +0000 UTC m=+0.080213828 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, managed_by=edpm_ansible, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., architecture=x86_64, release=1755695350, com.redhat.component=ubi9-minimal-container, config_id=edpm, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vcs-type=git, version=9.6)
Oct 07 21:44:10 compute-0 sshd-session[213850]: pam_unix(sshd:auth): check pass; user unknown
Oct 07 21:44:10 compute-0 sshd-session[213850]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=116.110.151.5
Oct 07 21:44:12 compute-0 sshd-session[213850]: Failed password for invalid user oracle from 116.110.151.5 port 54600 ssh2
Oct 07 21:44:14 compute-0 sshd-session[213850]: Connection closed by invalid user oracle 116.110.151.5 port 54600 [preauth]
Oct 07 21:44:16 compute-0 nova_compute[192716]: 2025-10-07 21:44:16.274 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:44:16 compute-0 nova_compute[192716]: 2025-10-07 21:44:16.275 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:44:16 compute-0 nova_compute[192716]: 2025-10-07 21:44:16.275 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:44:16 compute-0 nova_compute[192716]: 2025-10-07 21:44:16.276 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:44:16 compute-0 nova_compute[192716]: 2025-10-07 21:44:16.276 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:44:16 compute-0 nova_compute[192716]: 2025-10-07 21:44:16.276 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:44:16 compute-0 nova_compute[192716]: 2025-10-07 21:44:16.277 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:44:16 compute-0 nova_compute[192716]: 2025-10-07 21:44:16.277 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 07 21:44:16 compute-0 nova_compute[192716]: 2025-10-07 21:44:16.277 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:44:16 compute-0 nova_compute[192716]: 2025-10-07 21:44:16.793 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:44:16 compute-0 nova_compute[192716]: 2025-10-07 21:44:16.794 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:44:16 compute-0 nova_compute[192716]: 2025-10-07 21:44:16.794 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:44:16 compute-0 nova_compute[192716]: 2025-10-07 21:44:16.794 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 07 21:44:17 compute-0 nova_compute[192716]: 2025-10-07 21:44:17.023 2 WARNING nova.virt.libvirt.driver [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 21:44:17 compute-0 nova_compute[192716]: 2025-10-07 21:44:17.025 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:44:17 compute-0 nova_compute[192716]: 2025-10-07 21:44:17.050 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.026s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:44:17 compute-0 nova_compute[192716]: 2025-10-07 21:44:17.051 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6121MB free_disk=73.34170532226562GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 07 21:44:17 compute-0 nova_compute[192716]: 2025-10-07 21:44:17.052 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:44:17 compute-0 nova_compute[192716]: 2025-10-07 21:44:17.053 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:44:18 compute-0 nova_compute[192716]: 2025-10-07 21:44:18.121 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 07 21:44:18 compute-0 nova_compute[192716]: 2025-10-07 21:44:18.122 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:44:17 up 53 min,  0 user,  load average: 0.39, 0.69, 0.62\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 07 21:44:18 compute-0 nova_compute[192716]: 2025-10-07 21:44:18.145 2 DEBUG nova.compute.provider_tree [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 21:44:18 compute-0 nova_compute[192716]: 2025-10-07 21:44:18.653 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 21:44:19 compute-0 nova_compute[192716]: 2025-10-07 21:44:19.166 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 07 21:44:19 compute-0 nova_compute[192716]: 2025-10-07 21:44:19.167 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.114s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:44:21 compute-0 podman[213875]: 2025-10-07 21:44:21.862522156 +0000 UTC m=+0.085906321 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, config_id=iscsid, io.buildah.version=1.41.4)
Oct 07 21:44:21 compute-0 podman[213876]: 2025-10-07 21:44:21.862497426 +0000 UTC m=+0.082316805 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Oct 07 21:44:24 compute-0 podman[213915]: 2025-10-07 21:44:24.848105277 +0000 UTC m=+0.081331708 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 07 21:44:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:44:25.587 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:44:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:44:25.588 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:44:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:44:25.588 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:44:32 compute-0 unix_chkpwd[213949]: password check failed for user (root)
Oct 07 21:44:32 compute-0 sshd-session[213941]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.7  user=root
Oct 07 21:44:32 compute-0 podman[213943]: 2025-10-07 21:44:32.901322429 +0000 UTC m=+0.130306150 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, tcib_build_tag=watcher_latest, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 07 21:44:35 compute-0 sshd-session[213941]: Failed password for root from 193.46.255.7 port 36182 ssh2
Oct 07 21:44:36 compute-0 sshd-session[213970]: Invalid user rebecca from 116.110.151.5 port 42150
Oct 07 21:44:36 compute-0 podman[213972]: 2025-10-07 21:44:36.630397563 +0000 UTC m=+0.060982079 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Oct 07 21:44:36 compute-0 unix_chkpwd[213991]: password check failed for user (root)
Oct 07 21:44:37 compute-0 sshd-session[213970]: pam_unix(sshd:auth): check pass; user unknown
Oct 07 21:44:37 compute-0 sshd-session[213970]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=116.110.151.5
Oct 07 21:44:38 compute-0 PackageKit[130197]: daemon quit
Oct 07 21:44:38 compute-0 systemd[1]: packagekit.service: Deactivated successfully.
Oct 07 21:44:38 compute-0 sshd-session[213941]: Failed password for root from 193.46.255.7 port 36182 ssh2
Oct 07 21:44:38 compute-0 sshd-session[213970]: Failed password for invalid user rebecca from 116.110.151.5 port 42150 ssh2
Oct 07 21:44:39 compute-0 unix_chkpwd[213992]: password check failed for user (root)
Oct 07 21:44:39 compute-0 sshd-session[213970]: Connection closed by invalid user rebecca 116.110.151.5 port 42150 [preauth]
Oct 07 21:44:40 compute-0 podman[213993]: 2025-10-07 21:44:40.84236415 +0000 UTC m=+0.075427885 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, architecture=x86_64, release=1755695350, config_id=edpm, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., name=ubi9-minimal)
Oct 07 21:44:41 compute-0 sshd-session[213941]: Failed password for root from 193.46.255.7 port 36182 ssh2
Oct 07 21:44:43 compute-0 sshd-session[213941]: Received disconnect from 193.46.255.7 port 36182:11:  [preauth]
Oct 07 21:44:43 compute-0 sshd-session[213941]: Disconnected from authenticating user root 193.46.255.7 port 36182 [preauth]
Oct 07 21:44:43 compute-0 sshd-session[213941]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.7  user=root
Oct 07 21:44:44 compute-0 unix_chkpwd[214016]: password check failed for user (root)
Oct 07 21:44:44 compute-0 sshd-session[214014]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.7  user=root
Oct 07 21:44:46 compute-0 sshd-session[214014]: Failed password for root from 193.46.255.7 port 10532 ssh2
Oct 07 21:44:48 compute-0 unix_chkpwd[214017]: password check failed for user (root)
Oct 07 21:44:50 compute-0 sshd-session[214014]: Failed password for root from 193.46.255.7 port 10532 ssh2
Oct 07 21:44:50 compute-0 unix_chkpwd[214018]: password check failed for user (root)
Oct 07 21:44:52 compute-0 sshd-session[214014]: Failed password for root from 193.46.255.7 port 10532 ssh2
Oct 07 21:44:52 compute-0 sshd-session[214014]: Received disconnect from 193.46.255.7 port 10532:11:  [preauth]
Oct 07 21:44:52 compute-0 sshd-session[214014]: Disconnected from authenticating user root 193.46.255.7 port 10532 [preauth]
Oct 07 21:44:52 compute-0 sshd-session[214014]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.7  user=root
Oct 07 21:44:52 compute-0 podman[214020]: 2025-10-07 21:44:52.825412346 +0000 UTC m=+0.059221579 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Oct 07 21:44:52 compute-0 podman[214019]: 2025-10-07 21:44:52.842757824 +0000 UTC m=+0.074195770 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 07 21:44:53 compute-0 unix_chkpwd[214060]: password check failed for user (root)
Oct 07 21:44:53 compute-0 sshd-session[214058]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.7  user=root
Oct 07 21:44:55 compute-0 sshd-session[214058]: Failed password for root from 193.46.255.7 port 57110 ssh2
Oct 07 21:44:55 compute-0 unix_chkpwd[214062]: password check failed for user (root)
Oct 07 21:44:55 compute-0 podman[214061]: 2025-10-07 21:44:55.868426715 +0000 UTC m=+0.103329331 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 07 21:44:57 compute-0 sshd-session[214058]: Failed password for root from 193.46.255.7 port 57110 ssh2
Oct 07 21:44:58 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:44:58.811 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:e1:cc', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '62:84:ba:a4:1b:31'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 21:44:58 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:44:58.811 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 07 21:44:58 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:44:58.813 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=dca786dc-b408-4181-8e47-0e14c60f13da, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:44:59 compute-0 podman[203153]: time="2025-10-07T21:44:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 21:44:59 compute-0 podman[203153]: @ - - [07/Oct/2025:21:44:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 21:44:59 compute-0 podman[203153]: @ - - [07/Oct/2025:21:44:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2978 "" "Go-http-client/1.1"
Oct 07 21:44:59 compute-0 unix_chkpwd[214090]: password check failed for user (root)
Oct 07 21:45:01 compute-0 openstack_network_exporter[205305]: ERROR   21:45:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 21:45:01 compute-0 openstack_network_exporter[205305]: ERROR   21:45:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:45:01 compute-0 openstack_network_exporter[205305]: ERROR   21:45:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:45:01 compute-0 openstack_network_exporter[205305]: ERROR   21:45:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 21:45:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 21:45:01 compute-0 openstack_network_exporter[205305]: ERROR   21:45:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 21:45:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 21:45:01 compute-0 sshd-session[214058]: Failed password for root from 193.46.255.7 port 57110 ssh2
Oct 07 21:45:02 compute-0 sshd-session[214058]: Received disconnect from 193.46.255.7 port 57110:11:  [preauth]
Oct 07 21:45:02 compute-0 sshd-session[214058]: Disconnected from authenticating user root 193.46.255.7 port 57110 [preauth]
Oct 07 21:45:02 compute-0 sshd-session[214058]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.7  user=root
Oct 07 21:45:03 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:45:03.210 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4c:f1:e7 192.168.122.171'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.122.171/24', 'neutron:device_id': 'ovnmeta-b680ebb8-b032-4152-b492-a47dfb9fbe73', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b680ebb8-b032-4152-b492-a47dfb9fbe73', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '293ff4341f3d48a4ae100bf4fc7b99bd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3a492d3e-12fc-41f0-8269-d6b729ab678b, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=e86ec0a7-cfc9-422f-bcb4-7bfd72f11389) old=Port_Binding(mac=['fa:16:3e:4c:f1:e7'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-b680ebb8-b032-4152-b492-a47dfb9fbe73', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b680ebb8-b032-4152-b492-a47dfb9fbe73', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '293ff4341f3d48a4ae100bf4fc7b99bd', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 21:45:03 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:45:03.211 103791 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port e86ec0a7-cfc9-422f-bcb4-7bfd72f11389 in datapath b680ebb8-b032-4152-b492-a47dfb9fbe73 updated
Oct 07 21:45:03 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:45:03.213 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b680ebb8-b032-4152-b492-a47dfb9fbe73, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 07 21:45:03 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:45:03.214 103791 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpg2iye9_g/privsep.sock']
Oct 07 21:45:03 compute-0 podman[214096]: 2025-10-07 21:45:03.877409914 +0000 UTC m=+0.111151062 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Oct 07 21:45:03 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:45:03.997 103791 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Oct 07 21:45:03 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:45:03.997 103791 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpg2iye9_g/privsep.sock __init__ /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:377
Oct 07 21:45:03 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:45:03.827 214116 INFO oslo.privsep.daemon [-] privsep daemon starting
Oct 07 21:45:03 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:45:03.832 214116 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Oct 07 21:45:03 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:45:03.834 214116 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Oct 07 21:45:03 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:45:03.834 214116 INFO oslo.privsep.daemon [-] privsep daemon running as pid 214116
Oct 07 21:45:03 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:45:03.999 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[327520c4-8738-4044-8e94-f7a130820f57]: (2,) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:45:04 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:45:04.549 214116 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:45:04 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:45:04.549 214116 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:45:04 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:45:04.549 214116 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:45:05 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:45:05.006 214116 INFO oslo_service.backend [-] Loading backend: eventlet
Oct 07 21:45:05 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:45:05.012 214116 INFO oslo_service.backend [-] Backend 'eventlet' successfully loaded and cached.
Oct 07 21:45:05 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:45:05.045 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[031076e3-eb75-4bf8-b306-416630a029df]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:45:06 compute-0 podman[214130]: 2025-10-07 21:45:06.827177977 +0000 UTC m=+0.071966638 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 07 21:45:11 compute-0 podman[214149]: 2025-10-07 21:45:11.826244147 +0000 UTC m=+0.066922605 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, maintainer=Red Hat, Inc., release=1755695350, distribution-scope=public, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_id=edpm, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, architecture=x86_64, io.openshift.expose-services=)
Oct 07 21:45:13 compute-0 unix_chkpwd[214170]: password check failed for user (root)
Oct 07 21:45:13 compute-0 sshd-session[214128]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=116.110.151.5  user=root
Oct 07 21:45:15 compute-0 sshd-session[214128]: Failed password for root from 116.110.151.5 port 54170 ssh2
Oct 07 21:45:17 compute-0 nova_compute[192716]: 2025-10-07 21:45:17.878 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:45:17 compute-0 nova_compute[192716]: 2025-10-07 21:45:17.879 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:45:18 compute-0 sshd-session[214128]: Connection closed by authenticating user root 116.110.151.5 port 54170 [preauth]
Oct 07 21:45:18 compute-0 nova_compute[192716]: 2025-10-07 21:45:18.391 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:45:18 compute-0 nova_compute[192716]: 2025-10-07 21:45:18.392 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:45:18 compute-0 nova_compute[192716]: 2025-10-07 21:45:18.392 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:45:18 compute-0 nova_compute[192716]: 2025-10-07 21:45:18.392 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:45:18 compute-0 nova_compute[192716]: 2025-10-07 21:45:18.392 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:45:18 compute-0 nova_compute[192716]: 2025-10-07 21:45:18.393 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:45:18 compute-0 nova_compute[192716]: 2025-10-07 21:45:18.393 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 07 21:45:18 compute-0 nova_compute[192716]: 2025-10-07 21:45:18.393 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:45:18 compute-0 nova_compute[192716]: 2025-10-07 21:45:18.917 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:45:18 compute-0 nova_compute[192716]: 2025-10-07 21:45:18.917 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:45:18 compute-0 nova_compute[192716]: 2025-10-07 21:45:18.918 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:45:18 compute-0 nova_compute[192716]: 2025-10-07 21:45:18.918 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 07 21:45:19 compute-0 nova_compute[192716]: 2025-10-07 21:45:19.142 2 WARNING nova.virt.libvirt.driver [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 21:45:19 compute-0 nova_compute[192716]: 2025-10-07 21:45:19.143 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:45:19 compute-0 nova_compute[192716]: 2025-10-07 21:45:19.166 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.022s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:45:19 compute-0 nova_compute[192716]: 2025-10-07 21:45:19.167 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6018MB free_disk=73.3410873413086GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 07 21:45:19 compute-0 nova_compute[192716]: 2025-10-07 21:45:19.167 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:45:19 compute-0 nova_compute[192716]: 2025-10-07 21:45:19.168 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:45:20 compute-0 nova_compute[192716]: 2025-10-07 21:45:20.229 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 07 21:45:20 compute-0 nova_compute[192716]: 2025-10-07 21:45:20.229 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:45:19 up 54 min,  0 user,  load average: 0.16, 0.56, 0.58\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 07 21:45:20 compute-0 nova_compute[192716]: 2025-10-07 21:45:20.253 2 DEBUG nova.compute.provider_tree [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 21:45:20 compute-0 nova_compute[192716]: 2025-10-07 21:45:20.766 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 21:45:21 compute-0 nova_compute[192716]: 2025-10-07 21:45:21.275 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 07 21:45:21 compute-0 nova_compute[192716]: 2025-10-07 21:45:21.276 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.108s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:45:23 compute-0 podman[214173]: 2025-10-07 21:45:23.865845234 +0000 UTC m=+0.089595844 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.build-date=20251007, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd)
Oct 07 21:45:23 compute-0 podman[214172]: 2025-10-07 21:45:23.866235215 +0000 UTC m=+0.095280654 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, container_name=iscsid, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 07 21:45:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:45:25.589 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:45:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:45:25.589 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:45:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:45:25.589 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:45:26 compute-0 podman[214215]: 2025-10-07 21:45:26.869584728 +0000 UTC m=+0.101056957 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 07 21:45:29 compute-0 podman[203153]: time="2025-10-07T21:45:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 21:45:29 compute-0 podman[203153]: @ - - [07/Oct/2025:21:45:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 21:45:29 compute-0 podman[203153]: @ - - [07/Oct/2025:21:45:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2979 "" "Go-http-client/1.1"
Oct 07 21:45:31 compute-0 openstack_network_exporter[205305]: ERROR   21:45:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 21:45:31 compute-0 openstack_network_exporter[205305]: ERROR   21:45:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:45:31 compute-0 openstack_network_exporter[205305]: ERROR   21:45:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:45:31 compute-0 openstack_network_exporter[205305]: ERROR   21:45:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 21:45:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 21:45:31 compute-0 openstack_network_exporter[205305]: ERROR   21:45:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 21:45:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 21:45:34 compute-0 podman[214239]: 2025-10-07 21:45:34.913631666 +0000 UTC m=+0.140389275 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251007)
Oct 07 21:45:37 compute-0 podman[214266]: 2025-10-07 21:45:37.837224142 +0000 UTC m=+0.072929774 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent)
Oct 07 21:45:42 compute-0 podman[214285]: 2025-10-07 21:45:42.849337191 +0000 UTC m=+0.082858515 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, architecture=x86_64, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.buildah.version=1.33.7, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, name=ubi9-minimal, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git)
Oct 07 21:45:49 compute-0 sshd-session[214309]: Invalid user test from 116.110.151.5 port 47836
Oct 07 21:45:49 compute-0 sshd-session[214307]: Invalid user admin from 116.110.151.5 port 47846
Oct 07 21:45:49 compute-0 sshd-session[214309]: pam_unix(sshd:auth): check pass; user unknown
Oct 07 21:45:49 compute-0 sshd-session[214309]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=116.110.151.5
Oct 07 21:45:50 compute-0 sshd-session[214307]: pam_unix(sshd:auth): check pass; user unknown
Oct 07 21:45:50 compute-0 sshd-session[214307]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=116.110.151.5
Oct 07 21:45:52 compute-0 sshd-session[214309]: Failed password for invalid user test from 116.110.151.5 port 47836 ssh2
Oct 07 21:45:52 compute-0 sshd-session[214307]: Failed password for invalid user admin from 116.110.151.5 port 47846 ssh2
Oct 07 21:45:53 compute-0 sshd-session[214309]: Connection closed by invalid user test 116.110.151.5 port 47836 [preauth]
Oct 07 21:45:54 compute-0 sshd-session[214307]: Connection closed by invalid user admin 116.110.151.5 port 47846 [preauth]
Oct 07 21:45:54 compute-0 podman[214311]: 2025-10-07 21:45:54.846731092 +0000 UTC m=+0.083658029 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.build-date=20251007, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 07 21:45:54 compute-0 podman[214312]: 2025-10-07 21:45:54.846818054 +0000 UTC m=+0.080535978 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251007, tcib_build_tag=watcher_latest)
Oct 07 21:45:57 compute-0 podman[214352]: 2025-10-07 21:45:57.833695867 +0000 UTC m=+0.069422719 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 07 21:45:59 compute-0 podman[203153]: time="2025-10-07T21:45:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 21:45:59 compute-0 podman[203153]: @ - - [07/Oct/2025:21:45:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 21:45:59 compute-0 podman[203153]: @ - - [07/Oct/2025:21:45:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2979 "" "Go-http-client/1.1"
Oct 07 21:46:01 compute-0 openstack_network_exporter[205305]: ERROR   21:46:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 21:46:01 compute-0 openstack_network_exporter[205305]: ERROR   21:46:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 21:46:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 21:46:01 compute-0 openstack_network_exporter[205305]: ERROR   21:46:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:46:01 compute-0 openstack_network_exporter[205305]: ERROR   21:46:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:46:01 compute-0 openstack_network_exporter[205305]: ERROR   21:46:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 21:46:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 21:46:05 compute-0 podman[214377]: 2025-10-07 21:46:05.867797538 +0000 UTC m=+0.100573395 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Oct 07 21:46:08 compute-0 podman[214403]: 2025-10-07 21:46:08.835977293 +0000 UTC m=+0.070467519 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct 07 21:46:09 compute-0 nova_compute[192716]: 2025-10-07 21:46:09.995 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:46:09 compute-0 nova_compute[192716]: 2025-10-07 21:46:09.995 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Oct 07 21:46:10 compute-0 nova_compute[192716]: 2025-10-07 21:46:10.502 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Oct 07 21:46:10 compute-0 nova_compute[192716]: 2025-10-07 21:46:10.504 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:46:10 compute-0 nova_compute[192716]: 2025-10-07 21:46:10.504 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Oct 07 21:46:11 compute-0 nova_compute[192716]: 2025-10-07 21:46:11.015 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:46:13 compute-0 nova_compute[192716]: 2025-10-07 21:46:13.514 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:46:13 compute-0 nova_compute[192716]: 2025-10-07 21:46:13.515 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:46:13 compute-0 nova_compute[192716]: 2025-10-07 21:46:13.515 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:46:13 compute-0 nova_compute[192716]: 2025-10-07 21:46:13.515 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:46:13 compute-0 nova_compute[192716]: 2025-10-07 21:46:13.516 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 07 21:46:13 compute-0 podman[214423]: 2025-10-07 21:46:13.855832484 +0000 UTC m=+0.090895647 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, config_id=edpm, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., 
architecture=x86_64, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct 07 21:46:14 compute-0 nova_compute[192716]: 2025-10-07 21:46:14.994 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:46:15 compute-0 nova_compute[192716]: 2025-10-07 21:46:15.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:46:15 compute-0 nova_compute[192716]: 2025-10-07 21:46:15.991 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:46:15 compute-0 nova_compute[192716]: 2025-10-07 21:46:15.991 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:46:16 compute-0 nova_compute[192716]: 2025-10-07 21:46:16.505 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:46:16 compute-0 nova_compute[192716]: 2025-10-07 21:46:16.506 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:46:16 compute-0 nova_compute[192716]: 2025-10-07 21:46:16.506 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:46:16 compute-0 nova_compute[192716]: 2025-10-07 21:46:16.506 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 07 21:46:16 compute-0 nova_compute[192716]: 2025-10-07 21:46:16.707 2 WARNING nova.virt.libvirt.driver [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 21:46:16 compute-0 nova_compute[192716]: 2025-10-07 21:46:16.709 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:46:16 compute-0 nova_compute[192716]: 2025-10-07 21:46:16.732 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.023s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:46:16 compute-0 nova_compute[192716]: 2025-10-07 21:46:16.733 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6077MB free_disk=73.34112167358398GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 07 21:46:16 compute-0 nova_compute[192716]: 2025-10-07 21:46:16.733 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:46:16 compute-0 nova_compute[192716]: 2025-10-07 21:46:16.734 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:46:17 compute-0 nova_compute[192716]: 2025-10-07 21:46:17.785 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 07 21:46:17 compute-0 nova_compute[192716]: 2025-10-07 21:46:17.785 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:46:16 up 55 min,  0 user,  load average: 0.06, 0.47, 0.54\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 07 21:46:17 compute-0 nova_compute[192716]: 2025-10-07 21:46:17.803 2 DEBUG nova.compute.provider_tree [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 21:46:18 compute-0 nova_compute[192716]: 2025-10-07 21:46:18.311 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 21:46:18 compute-0 nova_compute[192716]: 2025-10-07 21:46:18.822 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 07 21:46:18 compute-0 nova_compute[192716]: 2025-10-07 21:46:18.823 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.089s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:46:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:46:25.591 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:46:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:46:25.591 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:46:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:46:25.592 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:46:25 compute-0 podman[214447]: 2025-10-07 21:46:25.824089877 +0000 UTC m=+0.062898761 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251007, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4)
Oct 07 21:46:25 compute-0 podman[214446]: 2025-10-07 21:46:25.869769621 +0000 UTC m=+0.105521867 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_id=iscsid)
Oct 07 21:46:28 compute-0 podman[214484]: 2025-10-07 21:46:28.834422233 +0000 UTC m=+0.074528945 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 07 21:46:29 compute-0 podman[203153]: time="2025-10-07T21:46:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 21:46:29 compute-0 podman[203153]: @ - - [07/Oct/2025:21:46:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 21:46:29 compute-0 podman[203153]: @ - - [07/Oct/2025:21:46:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2980 "" "Go-http-client/1.1"
Oct 07 21:46:31 compute-0 openstack_network_exporter[205305]: ERROR   21:46:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 21:46:31 compute-0 openstack_network_exporter[205305]: ERROR   21:46:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:46:31 compute-0 openstack_network_exporter[205305]: ERROR   21:46:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:46:31 compute-0 openstack_network_exporter[205305]: ERROR   21:46:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 21:46:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 21:46:31 compute-0 openstack_network_exporter[205305]: ERROR   21:46:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 21:46:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 21:46:36 compute-0 podman[214510]: 2025-10-07 21:46:36.905577671 +0000 UTC m=+0.142210003 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, maintainer=OpenStack Kubernetes Operator team)
Oct 07 21:46:39 compute-0 podman[214536]: 2025-10-07 21:46:39.860519343 +0000 UTC m=+0.094652555 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251007, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct 07 21:46:44 compute-0 podman[214556]: 2025-10-07 21:46:44.843908519 +0000 UTC m=+0.079700410 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., name=ubi9-minimal, release=1755695350, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct 07 21:46:56 compute-0 podman[214576]: 2025-10-07 21:46:56.844178391 +0000 UTC m=+0.072592156 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251007, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Oct 07 21:46:56 compute-0 podman[214577]: 2025-10-07 21:46:56.868982337 +0000 UTC m=+0.090585364 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Oct 07 21:46:59 compute-0 podman[203153]: time="2025-10-07T21:46:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 21:46:59 compute-0 podman[203153]: @ - - [07/Oct/2025:21:46:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 21:46:59 compute-0 podman[203153]: @ - - [07/Oct/2025:21:46:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2984 "" "Go-http-client/1.1"
Oct 07 21:46:59 compute-0 podman[214618]: 2025-10-07 21:46:59.829113189 +0000 UTC m=+0.066416107 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 07 21:47:01 compute-0 openstack_network_exporter[205305]: ERROR   21:47:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 21:47:01 compute-0 openstack_network_exporter[205305]: ERROR   21:47:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:47:01 compute-0 openstack_network_exporter[205305]: ERROR   21:47:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:47:01 compute-0 openstack_network_exporter[205305]: ERROR   21:47:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 21:47:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 21:47:01 compute-0 openstack_network_exporter[205305]: ERROR   21:47:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 21:47:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 21:47:07 compute-0 podman[214642]: 2025-10-07 21:47:07.897599096 +0000 UTC m=+0.129981951 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 07 21:47:10 compute-0 podman[214668]: 2025-10-07 21:47:10.836773133 +0000 UTC m=+0.070771822 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Oct 07 21:47:15 compute-0 nova_compute[192716]: 2025-10-07 21:47:15.822 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:47:15 compute-0 nova_compute[192716]: 2025-10-07 21:47:15.823 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:47:15 compute-0 nova_compute[192716]: 2025-10-07 21:47:15.823 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:47:15 compute-0 nova_compute[192716]: 2025-10-07 21:47:15.823 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:47:15 compute-0 nova_compute[192716]: 2025-10-07 21:47:15.823 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 07 21:47:15 compute-0 podman[214689]: 2025-10-07 21:47:15.841776006 +0000 UTC m=+0.073810021 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vendor=Red Hat, Inc., name=ubi9-minimal, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-type=git, io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.openshift.expose-services=)
Oct 07 21:47:15 compute-0 nova_compute[192716]: 2025-10-07 21:47:15.991 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:47:16 compute-0 nova_compute[192716]: 2025-10-07 21:47:16.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:47:16 compute-0 nova_compute[192716]: 2025-10-07 21:47:16.991 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:47:17 compute-0 nova_compute[192716]: 2025-10-07 21:47:17.508 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:47:17 compute-0 nova_compute[192716]: 2025-10-07 21:47:17.509 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:47:17 compute-0 nova_compute[192716]: 2025-10-07 21:47:17.509 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:47:17 compute-0 nova_compute[192716]: 2025-10-07 21:47:17.510 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 07 21:47:17 compute-0 nova_compute[192716]: 2025-10-07 21:47:17.694 2 WARNING nova.virt.libvirt.driver [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 21:47:17 compute-0 nova_compute[192716]: 2025-10-07 21:47:17.695 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:47:17 compute-0 nova_compute[192716]: 2025-10-07 21:47:17.735 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:47:17 compute-0 nova_compute[192716]: 2025-10-07 21:47:17.736 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6086MB free_disk=73.34110260009766GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 07 21:47:17 compute-0 nova_compute[192716]: 2025-10-07 21:47:17.736 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:47:17 compute-0 nova_compute[192716]: 2025-10-07 21:47:17.737 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:47:18 compute-0 nova_compute[192716]: 2025-10-07 21:47:18.844 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 07 21:47:18 compute-0 nova_compute[192716]: 2025-10-07 21:47:18.844 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:47:17 up 56 min,  0 user,  load average: 0.02, 0.37, 0.50\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 07 21:47:18 compute-0 nova_compute[192716]: 2025-10-07 21:47:18.942 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Refreshing inventories for resource provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Oct 07 21:47:19 compute-0 nova_compute[192716]: 2025-10-07 21:47:19.010 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Updating ProviderTree inventory for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Oct 07 21:47:19 compute-0 nova_compute[192716]: 2025-10-07 21:47:19.011 2 DEBUG nova.compute.provider_tree [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Updating inventory in ProviderTree for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Oct 07 21:47:19 compute-0 nova_compute[192716]: 2025-10-07 21:47:19.033 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Refreshing aggregate associations for resource provider 19d1aa8e-e3fb-43ab-9849-122569e48a32, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Oct 07 21:47:19 compute-0 nova_compute[192716]: 2025-10-07 21:47:19.072 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Refreshing trait associations for resource provider 19d1aa8e-e3fb-43ab-9849-122569e48a32, traits: COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_RESCUE_BFV,COMPUTE_ACCELERATORS,HW_CPU_X86_CLMUL,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NODE,HW_CPU_X86_AVX2,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSSE3,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_ARCH_X86_64,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_DEVICE_TAGGING,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_EXTEND,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_AVX,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE41,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SVM,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_BMI2,COMPUTE_SOUND_MODEL_SB16,COMPUTE_SOUND_MODEL_USB,COMPUTE_SOUND_MODEL_AC97,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_AESNI,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,HW_CPU_X86_F16C,HW_CPU_X86_MMX,COMPUTE_SECURITY_TPM_TIS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_CRB,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_ABM,HW_ARCH_X86_64,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_STORAGE_BUS_SCSI _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Oct 07 21:47:19 compute-0 nova_compute[192716]: 2025-10-07 21:47:19.098 2 DEBUG nova.compute.provider_tree [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 21:47:19 compute-0 nova_compute[192716]: 2025-10-07 21:47:19.609 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 21:47:20 compute-0 nova_compute[192716]: 2025-10-07 21:47:20.120 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 07 21:47:20 compute-0 nova_compute[192716]: 2025-10-07 21:47:20.121 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.384s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:47:21 compute-0 nova_compute[192716]: 2025-10-07 21:47:21.122 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:47:21 compute-0 nova_compute[192716]: 2025-10-07 21:47:21.633 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:47:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:47:25.593 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:47:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:47:25.594 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:47:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:47:25.594 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:47:27 compute-0 podman[214713]: 2025-10-07 21:47:27.857968753 +0000 UTC m=+0.083191061 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, config_id=iscsid, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 07 21:47:27 compute-0 podman[214714]: 2025-10-07 21:47:27.858236 +0000 UTC m=+0.087377521 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251007, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4)
Oct 07 21:47:29 compute-0 podman[203153]: time="2025-10-07T21:47:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 21:47:29 compute-0 podman[203153]: @ - - [07/Oct/2025:21:47:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 21:47:29 compute-0 podman[203153]: @ - - [07/Oct/2025:21:47:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2983 "" "Go-http-client/1.1"
Oct 07 21:47:30 compute-0 podman[214753]: 2025-10-07 21:47:30.861949471 +0000 UTC m=+0.089714309 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 07 21:47:31 compute-0 openstack_network_exporter[205305]: ERROR   21:47:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:47:31 compute-0 openstack_network_exporter[205305]: ERROR   21:47:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:47:31 compute-0 openstack_network_exporter[205305]: ERROR   21:47:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 21:47:31 compute-0 openstack_network_exporter[205305]: ERROR   21:47:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 21:47:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 21:47:31 compute-0 openstack_network_exporter[205305]: ERROR   21:47:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 21:47:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 21:47:37 compute-0 sshd-session[214777]: Invalid user guest1 from 116.110.151.5 port 49550
Oct 07 21:47:37 compute-0 sshd-session[214777]: pam_unix(sshd:auth): check pass; user unknown
Oct 07 21:47:37 compute-0 sshd-session[214777]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=116.110.151.5
Oct 07 21:47:38 compute-0 podman[214781]: 2025-10-07 21:47:38.092283609 +0000 UTC m=+0.099668646 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible)
Oct 07 21:47:38 compute-0 sshd-session[214779]: Invalid user btf from 116.110.151.5 port 49558
Oct 07 21:47:39 compute-0 sshd-session[214779]: pam_unix(sshd:auth): check pass; user unknown
Oct 07 21:47:39 compute-0 sshd-session[214779]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=116.110.151.5
Oct 07 21:47:39 compute-0 sshd-session[214777]: Failed password for invalid user guest1 from 116.110.151.5 port 49550 ssh2
Oct 07 21:47:41 compute-0 sshd-session[214777]: Connection closed by invalid user guest1 116.110.151.5 port 49550 [preauth]
Oct 07 21:47:41 compute-0 sshd-session[214779]: Failed password for invalid user btf from 116.110.151.5 port 49558 ssh2
Oct 07 21:47:41 compute-0 podman[214809]: 2025-10-07 21:47:41.851279817 +0000 UTC m=+0.084188430 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Oct 07 21:47:41 compute-0 sshd-session[214779]: Connection closed by invalid user btf 116.110.151.5 port 49558 [preauth]
Oct 07 21:47:46 compute-0 podman[214828]: 2025-10-07 21:47:46.848937417 +0000 UTC m=+0.079394771 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, release=1755695350, maintainer=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct 07 21:47:50 compute-0 unix_chkpwd[214853]: password check failed for user (root)
Oct 07 21:47:50 compute-0 sshd-session[214850]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=116.110.151.5  user=root
Oct 07 21:47:52 compute-0 sshd-session[214850]: Failed password for root from 116.110.151.5 port 52762 ssh2
Oct 07 21:47:54 compute-0 sshd-session[214850]: Connection closed by authenticating user root 116.110.151.5 port 52762 [preauth]
Oct 07 21:47:58 compute-0 podman[214854]: 2025-10-07 21:47:58.854099058 +0000 UTC m=+0.084450867 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=iscsid)
Oct 07 21:47:58 compute-0 podman[214855]: 2025-10-07 21:47:58.868710229 +0000 UTC m=+0.093176449 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 07 21:47:59 compute-0 podman[203153]: time="2025-10-07T21:47:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 21:47:59 compute-0 podman[203153]: @ - - [07/Oct/2025:21:47:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 21:47:59 compute-0 podman[203153]: @ - - [07/Oct/2025:21:47:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2982 "" "Go-http-client/1.1"
Oct 07 21:48:01 compute-0 openstack_network_exporter[205305]: ERROR   21:48:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 21:48:01 compute-0 openstack_network_exporter[205305]: ERROR   21:48:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:48:01 compute-0 openstack_network_exporter[205305]: ERROR   21:48:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:48:01 compute-0 openstack_network_exporter[205305]: ERROR   21:48:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 21:48:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 21:48:01 compute-0 openstack_network_exporter[205305]: ERROR   21:48:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 21:48:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 21:48:01 compute-0 podman[214892]: 2025-10-07 21:48:01.832196988 +0000 UTC m=+0.067608551 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 07 21:48:08 compute-0 podman[214917]: 2025-10-07 21:48:08.910787399 +0000 UTC m=+0.144264543 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2)
Oct 07 21:48:10 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:48:10.795 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:e1:cc', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '62:84:ba:a4:1b:31'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 21:48:10 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:48:10.797 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 07 21:48:12 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:48:12.263 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1c:60:23 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-54682f8b-05cf-4aab-acc7-0e5c53f72985', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-54682f8b-05cf-4aab-acc7-0e5c53f72985', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd9ce7542f6c04b51ba0cd904de5aac34', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0257712e-9df4-4c42-adef-63ff75deb753, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=c15df47c-a2ac-4365-a793-3e835627cc79) old=Port_Binding(mac=['fa:16:3e:1c:60:23'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-54682f8b-05cf-4aab-acc7-0e5c53f72985', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-54682f8b-05cf-4aab-acc7-0e5c53f72985', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd9ce7542f6c04b51ba0cd904de5aac34', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 21:48:12 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:48:12.266 103791 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port c15df47c-a2ac-4365-a793-3e835627cc79 in datapath 54682f8b-05cf-4aab-acc7-0e5c53f72985 updated
Oct 07 21:48:12 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:48:12.267 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 54682f8b-05cf-4aab-acc7-0e5c53f72985, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 07 21:48:12 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:48:12.269 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[8b95d096-72ac-4a8b-90e2-88c30f126186]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:48:12 compute-0 podman[214944]: 2025-10-07 21:48:12.366421685 +0000 UTC m=+0.070233677 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 07 21:48:12 compute-0 nova_compute[192716]: 2025-10-07 21:48:12.991 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:48:12 compute-0 nova_compute[192716]: 2025-10-07 21:48:12.992 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 07 21:48:14 compute-0 nova_compute[192716]: 2025-10-07 21:48:14.992 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:48:15 compute-0 nova_compute[192716]: 2025-10-07 21:48:15.986 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:48:16 compute-0 nova_compute[192716]: 2025-10-07 21:48:16.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:48:16 compute-0 nova_compute[192716]: 2025-10-07 21:48:16.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:48:17 compute-0 podman[214963]: 2025-10-07 21:48:17.890550184 +0000 UTC m=+0.119329774 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, version=9.6, container_name=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vcs-type=git, name=ubi9-minimal, managed_by=edpm_ansible, vendor=Red Hat, Inc., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct 07 21:48:17 compute-0 nova_compute[192716]: 2025-10-07 21:48:17.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:48:18 compute-0 nova_compute[192716]: 2025-10-07 21:48:18.989 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:48:18 compute-0 nova_compute[192716]: 2025-10-07 21:48:18.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:48:19 compute-0 nova_compute[192716]: 2025-10-07 21:48:19.501 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:48:19 compute-0 nova_compute[192716]: 2025-10-07 21:48:19.502 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:48:19 compute-0 nova_compute[192716]: 2025-10-07 21:48:19.502 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:48:19 compute-0 nova_compute[192716]: 2025-10-07 21:48:19.502 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 07 21:48:19 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:48:19.633 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:40:e9:bc 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-4b266c47-ddd6-41ce-8b85-080eb25200e5', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b266c47-ddd6-41ce-8b85-080eb25200e5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd66fbc6ffbdc4b6aaadd7e417ae8b453', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bfd1fc9d-2af8-401d-86d1-b6647f165ead, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=116c8d8c-7ce6-453d-bba4-ea9dca1a327e) old=Port_Binding(mac=['fa:16:3e:40:e9:bc'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-4b266c47-ddd6-41ce-8b85-080eb25200e5', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b266c47-ddd6-41ce-8b85-080eb25200e5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd66fbc6ffbdc4b6aaadd7e417ae8b453', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 21:48:19 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:48:19.634 103791 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 116c8d8c-7ce6-453d-bba4-ea9dca1a327e in datapath 4b266c47-ddd6-41ce-8b85-080eb25200e5 updated
Oct 07 21:48:19 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:48:19.635 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4b266c47-ddd6-41ce-8b85-080eb25200e5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 07 21:48:19 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:48:19.636 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[d64049cb-59bb-4581-861f-2250965d7838]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:48:19 compute-0 nova_compute[192716]: 2025-10-07 21:48:19.716 2 WARNING nova.virt.libvirt.driver [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 21:48:19 compute-0 nova_compute[192716]: 2025-10-07 21:48:19.718 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:48:19 compute-0 nova_compute[192716]: 2025-10-07 21:48:19.745 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.027s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:48:19 compute-0 nova_compute[192716]: 2025-10-07 21:48:19.746 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6097MB free_disk=73.34111785888672GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 07 21:48:19 compute-0 nova_compute[192716]: 2025-10-07 21:48:19.746 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:48:19 compute-0 nova_compute[192716]: 2025-10-07 21:48:19.747 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:48:20 compute-0 nova_compute[192716]: 2025-10-07 21:48:20.795 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 07 21:48:20 compute-0 nova_compute[192716]: 2025-10-07 21:48:20.795 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:48:19 up 57 min,  0 user,  load average: 0.00, 0.30, 0.47\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 07 21:48:20 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:48:20.798 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=dca786dc-b408-4181-8e47-0e14c60f13da, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:48:20 compute-0 nova_compute[192716]: 2025-10-07 21:48:20.816 2 DEBUG nova.compute.provider_tree [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 21:48:21 compute-0 nova_compute[192716]: 2025-10-07 21:48:21.325 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 21:48:21 compute-0 nova_compute[192716]: 2025-10-07 21:48:21.836 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 07 21:48:21 compute-0 nova_compute[192716]: 2025-10-07 21:48:21.837 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.090s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:48:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:48:25.595 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:48:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:48:25.595 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:48:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:48:25.596 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:48:29 compute-0 sshd-session[214986]: Invalid user github from 103.115.24.11 port 33278
Oct 07 21:48:29 compute-0 sshd-session[214986]: pam_unix(sshd:auth): check pass; user unknown
Oct 07 21:48:29 compute-0 sshd-session[214986]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.115.24.11
Oct 07 21:48:29 compute-0 podman[214989]: 2025-10-07 21:48:29.138016866 +0000 UTC m=+0.087180376 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251007, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=multipathd)
Oct 07 21:48:29 compute-0 podman[214988]: 2025-10-07 21:48:29.143785292 +0000 UTC m=+0.097314608 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, tcib_managed=true, config_id=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=iscsid, io.buildah.version=1.41.4)
Oct 07 21:48:29 compute-0 podman[203153]: time="2025-10-07T21:48:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 21:48:29 compute-0 podman[203153]: @ - - [07/Oct/2025:21:48:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 21:48:29 compute-0 podman[203153]: @ - - [07/Oct/2025:21:48:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2982 "" "Go-http-client/1.1"
Oct 07 21:48:30 compute-0 sshd-session[214986]: Failed password for invalid user github from 103.115.24.11 port 33278 ssh2
Oct 07 21:48:31 compute-0 openstack_network_exporter[205305]: ERROR   21:48:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 21:48:31 compute-0 openstack_network_exporter[205305]: ERROR   21:48:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:48:31 compute-0 openstack_network_exporter[205305]: ERROR   21:48:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:48:31 compute-0 openstack_network_exporter[205305]: ERROR   21:48:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 21:48:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 21:48:31 compute-0 openstack_network_exporter[205305]: ERROR   21:48:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 21:48:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 21:48:31 compute-0 sshd-session[214986]: Received disconnect from 103.115.24.11 port 33278:11: Bye Bye [preauth]
Oct 07 21:48:31 compute-0 sshd-session[214986]: Disconnected from invalid user github 103.115.24.11 port 33278 [preauth]
Oct 07 21:48:32 compute-0 podman[215028]: 2025-10-07 21:48:32.836467277 +0000 UTC m=+0.074921642 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 07 21:48:39 compute-0 podman[215052]: 2025-10-07 21:48:39.939422801 +0000 UTC m=+0.171981293 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Oct 07 21:48:42 compute-0 podman[215080]: 2025-10-07 21:48:42.835812264 +0000 UTC m=+0.065835819 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent)
Oct 07 21:48:48 compute-0 podman[215099]: 2025-10-07 21:48:48.824234887 +0000 UTC m=+0.065875331 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_id=edpm, release=1755695350, io.buildah.version=1.33.7, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, managed_by=edpm_ansible)
Oct 07 21:48:59 compute-0 podman[203153]: time="2025-10-07T21:48:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 21:48:59 compute-0 podman[203153]: @ - - [07/Oct/2025:21:48:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 21:48:59 compute-0 podman[203153]: @ - - [07/Oct/2025:21:48:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2983 "" "Go-http-client/1.1"
Oct 07 21:48:59 compute-0 podman[215120]: 2025-10-07 21:48:59.82709704 +0000 UTC m=+0.065362731 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, tcib_managed=true, container_name=iscsid)
Oct 07 21:48:59 compute-0 podman[215121]: 2025-10-07 21:48:59.854837552 +0000 UTC m=+0.083223508 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Oct 07 21:49:01 compute-0 openstack_network_exporter[205305]: ERROR   21:49:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 21:49:01 compute-0 openstack_network_exporter[205305]: ERROR   21:49:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:49:01 compute-0 openstack_network_exporter[205305]: ERROR   21:49:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:49:01 compute-0 openstack_network_exporter[205305]: ERROR   21:49:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 21:49:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 21:49:01 compute-0 openstack_network_exporter[205305]: ERROR   21:49:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 21:49:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 21:49:03 compute-0 podman[215162]: 2025-10-07 21:49:03.850584373 +0000 UTC m=+0.080500732 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 07 21:49:10 compute-0 podman[215187]: 2025-10-07 21:49:10.90128517 +0000 UTC m=+0.131210655 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct 07 21:49:13 compute-0 podman[215213]: 2025-10-07 21:49:13.822574153 +0000 UTC m=+0.068824837 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Oct 07 21:49:16 compute-0 nova_compute[192716]: 2025-10-07 21:49:16.837 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:49:16 compute-0 nova_compute[192716]: 2025-10-07 21:49:16.837 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:49:16 compute-0 nova_compute[192716]: 2025-10-07 21:49:16.838 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:49:16 compute-0 nova_compute[192716]: 2025-10-07 21:49:16.838 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 07 21:49:17 compute-0 nova_compute[192716]: 2025-10-07 21:49:17.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:49:17 compute-0 nova_compute[192716]: 2025-10-07 21:49:17.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:49:17 compute-0 nova_compute[192716]: 2025-10-07 21:49:17.991 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:49:18 compute-0 unix_chkpwd[215238]: password check failed for user (root)
Oct 07 21:49:18 compute-0 sshd-session[215235]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=78.128.112.74  user=root
Oct 07 21:49:18 compute-0 nova_compute[192716]: 2025-10-07 21:49:18.986 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:49:19 compute-0 sshd-session[215234]: Invalid user plex from 116.110.151.5 port 50552
Oct 07 21:49:19 compute-0 podman[215239]: 2025-10-07 21:49:19.244598569 +0000 UTC m=+0.078838966 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, maintainer=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, container_name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, io.buildah.version=1.33.7, release=1755695350, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, architecture=x86_64)
Oct 07 21:49:19 compute-0 nova_compute[192716]: 2025-10-07 21:49:19.495 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:49:19 compute-0 sshd-session[215234]: pam_unix(sshd:auth): check pass; user unknown
Oct 07 21:49:19 compute-0 sshd-session[215234]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=116.110.151.5
Oct 07 21:49:19 compute-0 sshd-session[215235]: Failed password for root from 78.128.112.74 port 56216 ssh2
Oct 07 21:49:20 compute-0 sshd-session[215235]: Connection closed by authenticating user root 78.128.112.74 port 56216 [preauth]
Oct 07 21:49:20 compute-0 nova_compute[192716]: 2025-10-07 21:49:20.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:49:21 compute-0 sshd-session[215234]: Failed password for invalid user plex from 116.110.151.5 port 50552 ssh2
Oct 07 21:49:21 compute-0 nova_compute[192716]: 2025-10-07 21:49:21.503 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:49:21 compute-0 nova_compute[192716]: 2025-10-07 21:49:21.505 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:49:21 compute-0 nova_compute[192716]: 2025-10-07 21:49:21.505 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:49:21 compute-0 nova_compute[192716]: 2025-10-07 21:49:21.506 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 07 21:49:21 compute-0 nova_compute[192716]: 2025-10-07 21:49:21.655 2 WARNING nova.virt.libvirt.driver [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 21:49:21 compute-0 nova_compute[192716]: 2025-10-07 21:49:21.656 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:49:21 compute-0 nova_compute[192716]: 2025-10-07 21:49:21.673 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.018s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:49:21 compute-0 nova_compute[192716]: 2025-10-07 21:49:21.674 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6080MB free_disk=73.34111785888672GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 07 21:49:21 compute-0 nova_compute[192716]: 2025-10-07 21:49:21.675 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:49:21 compute-0 nova_compute[192716]: 2025-10-07 21:49:21.675 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:49:21 compute-0 sshd-session[215234]: Connection closed by invalid user plex 116.110.151.5 port 50552 [preauth]
Oct 07 21:49:22 compute-0 nova_compute[192716]: 2025-10-07 21:49:22.763 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 07 21:49:22 compute-0 nova_compute[192716]: 2025-10-07 21:49:22.764 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:49:21 up 58 min,  0 user,  load average: 0.00, 0.24, 0.44\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 07 21:49:22 compute-0 nova_compute[192716]: 2025-10-07 21:49:22.795 2 DEBUG nova.compute.provider_tree [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 21:49:23 compute-0 nova_compute[192716]: 2025-10-07 21:49:23.304 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 21:49:23 compute-0 nova_compute[192716]: 2025-10-07 21:49:23.814 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 07 21:49:23 compute-0 nova_compute[192716]: 2025-10-07 21:49:23.815 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.140s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:49:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:49:25.580 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:e1:cc', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '62:84:ba:a4:1b:31'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 21:49:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:49:25.581 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 07 21:49:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:49:25.597 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:49:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:49:25.597 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:49:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:49:25.597 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:49:26 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:49:26.583 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=dca786dc-b408-4181-8e47-0e14c60f13da, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:49:29 compute-0 podman[203153]: time="2025-10-07T21:49:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 21:49:29 compute-0 podman[203153]: @ - - [07/Oct/2025:21:49:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 21:49:29 compute-0 podman[203153]: @ - - [07/Oct/2025:21:49:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2981 "" "Go-http-client/1.1"
Oct 07 21:49:30 compute-0 podman[215264]: 2025-10-07 21:49:30.838781678 +0000 UTC m=+0.060278530 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest)
Oct 07 21:49:30 compute-0 podman[215263]: 2025-10-07 21:49:30.875596623 +0000 UTC m=+0.099439560 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251007, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 07 21:49:31 compute-0 openstack_network_exporter[205305]: ERROR   21:49:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:49:31 compute-0 openstack_network_exporter[205305]: ERROR   21:49:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:49:31 compute-0 openstack_network_exporter[205305]: ERROR   21:49:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 21:49:31 compute-0 openstack_network_exporter[205305]: ERROR   21:49:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 21:49:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 21:49:31 compute-0 openstack_network_exporter[205305]: ERROR   21:49:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 21:49:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 21:49:34 compute-0 podman[215303]: 2025-10-07 21:49:34.854933687 +0000 UTC m=+0.090545552 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 07 21:49:41 compute-0 podman[215327]: 2025-10-07 21:49:41.906816186 +0000 UTC m=+0.131602765 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 07 21:49:44 compute-0 podman[215354]: 2025-10-07 21:49:44.868063312 +0000 UTC m=+0.096064496 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251007, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Oct 07 21:49:49 compute-0 podman[215375]: 2025-10-07 21:49:49.817204121 +0000 UTC m=+0.064467826 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.expose-services=, container_name=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, vendor=Red Hat, Inc., config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct 07 21:49:50 compute-0 unix_chkpwd[215396]: password check failed for user (root)
Oct 07 21:49:50 compute-0 sshd-session[215353]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=116.110.151.5  user=root
Oct 07 21:49:51 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:49:51.684 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7d:06:a7 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-beb3e880-54b4-458c-8efd-ef631c65fa1e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-beb3e880-54b4-458c-8efd-ef631c65fa1e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9eaa040469bd4cefb0eec956f814a225', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e95305b8-51d0-4753-9117-c1612bb00fbf, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=8f5a7227-c809-4a74-8f29-90ca5d40ed87) old=Port_Binding(mac=['fa:16:3e:7d:06:a7'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-beb3e880-54b4-458c-8efd-ef631c65fa1e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-beb3e880-54b4-458c-8efd-ef631c65fa1e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9eaa040469bd4cefb0eec956f814a225', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 21:49:51 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:49:51.686 103791 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 8f5a7227-c809-4a74-8f29-90ca5d40ed87 in datapath beb3e880-54b4-458c-8efd-ef631c65fa1e updated
Oct 07 21:49:51 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:49:51.687 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network beb3e880-54b4-458c-8efd-ef631c65fa1e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 07 21:49:51 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:49:51.688 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[abae7e29-4626-4c48-829d-1cd01087bde4]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:49:52 compute-0 sshd-session[215353]: Failed password for root from 116.110.151.5 port 36390 ssh2
Oct 07 21:49:55 compute-0 sshd-session[215353]: Connection closed by authenticating user root 116.110.151.5 port 36390 [preauth]
Oct 07 21:49:59 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:49:59.092 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8d:ca:11 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-f8cc9524-f685-405d-8bd0-96bc93b36026', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f8cc9524-f685-405d-8bd0-96bc93b36026', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd2a134798f4845d3adf6745353aa88f1', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1fdabb81-cb0a-4e5c-99a6-984cd0dc28eb, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=645c6c68-25c4-40e5-9797-150f1b4a08ef) old=Port_Binding(mac=['fa:16:3e:8d:ca:11'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-f8cc9524-f685-405d-8bd0-96bc93b36026', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f8cc9524-f685-405d-8bd0-96bc93b36026', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd2a134798f4845d3adf6745353aa88f1', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 21:49:59 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:49:59.093 103791 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 645c6c68-25c4-40e5-9797-150f1b4a08ef in datapath f8cc9524-f685-405d-8bd0-96bc93b36026 updated
Oct 07 21:49:59 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:49:59.094 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f8cc9524-f685-405d-8bd0-96bc93b36026, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 07 21:49:59 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:49:59.096 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[c74558f5-9080-4db5-811f-4bddf894ceb1]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:49:59 compute-0 podman[203153]: time="2025-10-07T21:49:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 21:49:59 compute-0 podman[203153]: @ - - [07/Oct/2025:21:49:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 21:49:59 compute-0 podman[203153]: @ - - [07/Oct/2025:21:49:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2983 "" "Go-http-client/1.1"
Oct 07 21:50:01 compute-0 openstack_network_exporter[205305]: ERROR   21:50:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:50:01 compute-0 openstack_network_exporter[205305]: ERROR   21:50:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:50:01 compute-0 openstack_network_exporter[205305]: ERROR   21:50:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 21:50:01 compute-0 openstack_network_exporter[205305]: ERROR   21:50:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 21:50:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 21:50:01 compute-0 openstack_network_exporter[205305]: ERROR   21:50:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 21:50:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 21:50:01 compute-0 podman[215398]: 2025-10-07 21:50:01.835499479 +0000 UTC m=+0.070776942 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true)
Oct 07 21:50:01 compute-0 podman[215397]: 2025-10-07 21:50:01.846219887 +0000 UTC m=+0.082179879 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=iscsid, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Oct 07 21:50:05 compute-0 podman[215438]: 2025-10-07 21:50:05.848163891 +0000 UTC m=+0.083886687 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 07 21:50:12 compute-0 podman[215463]: 2025-10-07 21:50:12.860774767 +0000 UTC m=+0.098844554 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251007, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest)
Oct 07 21:50:15 compute-0 podman[215489]: 2025-10-07 21:50:15.833618236 +0000 UTC m=+0.068706074 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2)
Oct 07 21:50:16 compute-0 nova_compute[192716]: 2025-10-07 21:50:16.815 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:50:16 compute-0 nova_compute[192716]: 2025-10-07 21:50:16.816 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:50:16 compute-0 nova_compute[192716]: 2025-10-07 21:50:16.816 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 07 21:50:16 compute-0 nova_compute[192716]: 2025-10-07 21:50:16.986 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:50:17 compute-0 nova_compute[192716]: 2025-10-07 21:50:17.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:50:19 compute-0 nova_compute[192716]: 2025-10-07 21:50:19.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:50:19 compute-0 nova_compute[192716]: 2025-10-07 21:50:19.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:50:20 compute-0 podman[215509]: 2025-10-07 21:50:20.83361174 +0000 UTC m=+0.070138694 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_id=edpm, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, version=9.6, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, container_name=openstack_network_exporter, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, build-date=2025-08-20T13:12:41)
Oct 07 21:50:20 compute-0 nova_compute[192716]: 2025-10-07 21:50:20.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:50:20 compute-0 nova_compute[192716]: 2025-10-07 21:50:20.991 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:50:21 compute-0 nova_compute[192716]: 2025-10-07 21:50:21.504 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:50:21 compute-0 nova_compute[192716]: 2025-10-07 21:50:21.505 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:50:21 compute-0 nova_compute[192716]: 2025-10-07 21:50:21.505 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:50:21 compute-0 nova_compute[192716]: 2025-10-07 21:50:21.506 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 07 21:50:21 compute-0 nova_compute[192716]: 2025-10-07 21:50:21.727 2 WARNING nova.virt.libvirt.driver [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 21:50:21 compute-0 nova_compute[192716]: 2025-10-07 21:50:21.728 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:50:21 compute-0 nova_compute[192716]: 2025-10-07 21:50:21.749 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.021s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:50:21 compute-0 nova_compute[192716]: 2025-10-07 21:50:21.750 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6094MB free_disk=73.34166717529297GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 07 21:50:21 compute-0 nova_compute[192716]: 2025-10-07 21:50:21.750 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:50:21 compute-0 nova_compute[192716]: 2025-10-07 21:50:21.751 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:50:22 compute-0 nova_compute[192716]: 2025-10-07 21:50:22.806 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 07 21:50:22 compute-0 nova_compute[192716]: 2025-10-07 21:50:22.807 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:50:21 up 59 min,  0 user,  load average: 0.00, 0.20, 0.41\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 07 21:50:22 compute-0 nova_compute[192716]: 2025-10-07 21:50:22.828 2 DEBUG nova.compute.provider_tree [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 21:50:23 compute-0 nova_compute[192716]: 2025-10-07 21:50:23.334 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 21:50:23 compute-0 nova_compute[192716]: 2025-10-07 21:50:23.843 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 07 21:50:23 compute-0 nova_compute[192716]: 2025-10-07 21:50:23.843 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.093s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:50:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:25.598 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:50:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:25.599 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:50:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:25.599 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:50:29 compute-0 podman[203153]: time="2025-10-07T21:50:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 21:50:29 compute-0 podman[203153]: @ - - [07/Oct/2025:21:50:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 21:50:29 compute-0 podman[203153]: @ - - [07/Oct/2025:21:50:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2984 "" "Go-http-client/1.1"
Oct 07 21:50:30 compute-0 unix_chkpwd[215535]: password check failed for user (root)
Oct 07 21:50:30 compute-0 sshd-session[215533]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=116.110.151.5  user=root
Oct 07 21:50:31 compute-0 openstack_network_exporter[205305]: ERROR   21:50:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:50:31 compute-0 openstack_network_exporter[205305]: ERROR   21:50:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 21:50:31 compute-0 openstack_network_exporter[205305]: ERROR   21:50:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:50:31 compute-0 openstack_network_exporter[205305]: ERROR   21:50:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 21:50:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 21:50:31 compute-0 openstack_network_exporter[205305]: ERROR   21:50:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 21:50:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 21:50:31 compute-0 nova_compute[192716]: 2025-10-07 21:50:31.734 2 DEBUG oslo_concurrency.lockutils [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Acquiring lock "992a0107-b0cb-4907-8d2b-bd9d32808982" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:50:31 compute-0 nova_compute[192716]: 2025-10-07 21:50:31.734 2 DEBUG oslo_concurrency.lockutils [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Lock "992a0107-b0cb-4907-8d2b-bd9d32808982" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:50:32 compute-0 nova_compute[192716]: 2025-10-07 21:50:32.239 2 DEBUG nova.compute.manager [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Oct 07 21:50:32 compute-0 sshd-session[215533]: Failed password for root from 116.110.151.5 port 34362 ssh2
Oct 07 21:50:32 compute-0 podman[215536]: 2025-10-07 21:50:32.825975296 +0000 UTC m=+0.064790646 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=iscsid, org.label-schema.build-date=20251007, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid)
Oct 07 21:50:32 compute-0 nova_compute[192716]: 2025-10-07 21:50:32.831 2 DEBUG oslo_concurrency.lockutils [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:50:32 compute-0 nova_compute[192716]: 2025-10-07 21:50:32.832 2 DEBUG oslo_concurrency.lockutils [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:50:32 compute-0 podman[215537]: 2025-10-07 21:50:32.835520101 +0000 UTC m=+0.066879103 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007)
Oct 07 21:50:32 compute-0 nova_compute[192716]: 2025-10-07 21:50:32.838 2 DEBUG nova.virt.hardware [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Oct 07 21:50:32 compute-0 nova_compute[192716]: 2025-10-07 21:50:32.838 2 INFO nova.compute.claims [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Claim successful on node compute-0.ctlplane.example.com
Oct 07 21:50:33 compute-0 sshd-session[215533]: Connection closed by authenticating user root 116.110.151.5 port 34362 [preauth]
Oct 07 21:50:33 compute-0 nova_compute[192716]: 2025-10-07 21:50:33.955 2 DEBUG nova.compute.provider_tree [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 21:50:34 compute-0 nova_compute[192716]: 2025-10-07 21:50:34.461 2 DEBUG nova.scheduler.client.report [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 21:50:34 compute-0 nova_compute[192716]: 2025-10-07 21:50:34.979 2 DEBUG oslo_concurrency.lockutils [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.147s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:50:34 compute-0 nova_compute[192716]: 2025-10-07 21:50:34.980 2 DEBUG nova.compute.manager [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Oct 07 21:50:35 compute-0 nova_compute[192716]: 2025-10-07 21:50:35.493 2 DEBUG nova.compute.manager [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Oct 07 21:50:35 compute-0 nova_compute[192716]: 2025-10-07 21:50:35.493 2 DEBUG nova.network.neutron [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Oct 07 21:50:35 compute-0 nova_compute[192716]: 2025-10-07 21:50:35.495 2 WARNING neutronclient.v2_0.client [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:50:35 compute-0 nova_compute[192716]: 2025-10-07 21:50:35.498 2 WARNING neutronclient.v2_0.client [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:50:36 compute-0 nova_compute[192716]: 2025-10-07 21:50:36.007 2 INFO nova.virt.libvirt.driver [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 07 21:50:36 compute-0 nova_compute[192716]: 2025-10-07 21:50:36.517 2 DEBUG nova.compute.manager [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Oct 07 21:50:36 compute-0 podman[215574]: 2025-10-07 21:50:36.851322462 +0000 UTC m=+0.080714789 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 07 21:50:36 compute-0 nova_compute[192716]: 2025-10-07 21:50:36.870 2 DEBUG nova.network.neutron [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Successfully created port: db056e33-1774-4cfd-86d7-b31a8e48eeec _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Oct 07 21:50:37 compute-0 nova_compute[192716]: 2025-10-07 21:50:37.534 2 DEBUG nova.compute.manager [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Oct 07 21:50:37 compute-0 nova_compute[192716]: 2025-10-07 21:50:37.536 2 DEBUG nova.virt.libvirt.driver [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Oct 07 21:50:37 compute-0 nova_compute[192716]: 2025-10-07 21:50:37.536 2 INFO nova.virt.libvirt.driver [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Creating image(s)
Oct 07 21:50:37 compute-0 nova_compute[192716]: 2025-10-07 21:50:37.537 2 DEBUG oslo_concurrency.lockutils [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Acquiring lock "/var/lib/nova/instances/992a0107-b0cb-4907-8d2b-bd9d32808982/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:50:37 compute-0 nova_compute[192716]: 2025-10-07 21:50:37.537 2 DEBUG oslo_concurrency.lockutils [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Lock "/var/lib/nova/instances/992a0107-b0cb-4907-8d2b-bd9d32808982/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:50:37 compute-0 nova_compute[192716]: 2025-10-07 21:50:37.537 2 DEBUG oslo_concurrency.lockutils [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Lock "/var/lib/nova/instances/992a0107-b0cb-4907-8d2b-bd9d32808982/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:50:37 compute-0 nova_compute[192716]: 2025-10-07 21:50:37.538 2 DEBUG oslo_concurrency.lockutils [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Acquiring lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:50:37 compute-0 nova_compute[192716]: 2025-10-07 21:50:37.538 2 DEBUG oslo_concurrency.lockutils [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:50:37 compute-0 nova_compute[192716]: 2025-10-07 21:50:37.892 2 DEBUG nova.network.neutron [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Successfully updated port: db056e33-1774-4cfd-86d7-b31a8e48eeec _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Oct 07 21:50:37 compute-0 nova_compute[192716]: 2025-10-07 21:50:37.983 2 DEBUG nova.compute.manager [req-dc184786-72ee-4899-b827-1b28fdcda4ad req-ee35d45e-1a80-4571-a504-ee13b51d0fa0 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Received event network-changed-db056e33-1774-4cfd-86d7-b31a8e48eeec external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 21:50:37 compute-0 nova_compute[192716]: 2025-10-07 21:50:37.983 2 DEBUG nova.compute.manager [req-dc184786-72ee-4899-b827-1b28fdcda4ad req-ee35d45e-1a80-4571-a504-ee13b51d0fa0 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Refreshing instance network info cache due to event network-changed-db056e33-1774-4cfd-86d7-b31a8e48eeec. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Oct 07 21:50:37 compute-0 nova_compute[192716]: 2025-10-07 21:50:37.983 2 DEBUG oslo_concurrency.lockutils [req-dc184786-72ee-4899-b827-1b28fdcda4ad req-ee35d45e-1a80-4571-a504-ee13b51d0fa0 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "refresh_cache-992a0107-b0cb-4907-8d2b-bd9d32808982" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 07 21:50:37 compute-0 nova_compute[192716]: 2025-10-07 21:50:37.984 2 DEBUG oslo_concurrency.lockutils [req-dc184786-72ee-4899-b827-1b28fdcda4ad req-ee35d45e-1a80-4571-a504-ee13b51d0fa0 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquired lock "refresh_cache-992a0107-b0cb-4907-8d2b-bd9d32808982" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 07 21:50:37 compute-0 nova_compute[192716]: 2025-10-07 21:50:37.984 2 DEBUG nova.network.neutron [req-dc184786-72ee-4899-b827-1b28fdcda4ad req-ee35d45e-1a80-4571-a504-ee13b51d0fa0 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Refreshing network info cache for port db056e33-1774-4cfd-86d7-b31a8e48eeec _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Oct 07 21:50:38 compute-0 nova_compute[192716]: 2025-10-07 21:50:38.398 2 DEBUG oslo_concurrency.lockutils [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Acquiring lock "refresh_cache-992a0107-b0cb-4907-8d2b-bd9d32808982" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 07 21:50:38 compute-0 nova_compute[192716]: 2025-10-07 21:50:38.491 2 WARNING neutronclient.v2_0.client [req-dc184786-72ee-4899-b827-1b28fdcda4ad req-ee35d45e-1a80-4571-a504-ee13b51d0fa0 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:50:38 compute-0 nova_compute[192716]: 2025-10-07 21:50:38.857 2 DEBUG nova.network.neutron [req-dc184786-72ee-4899-b827-1b28fdcda4ad req-ee35d45e-1a80-4571-a504-ee13b51d0fa0 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 07 21:50:39 compute-0 nova_compute[192716]: 2025-10-07 21:50:39.133 2 DEBUG nova.network.neutron [req-dc184786-72ee-4899-b827-1b28fdcda4ad req-ee35d45e-1a80-4571-a504-ee13b51d0fa0 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 21:50:39 compute-0 nova_compute[192716]: 2025-10-07 21:50:39.235 2 DEBUG oslo_utils.imageutils.format_inspector [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'QFI\xfb') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 21:50:39 compute-0 nova_compute[192716]: 2025-10-07 21:50:39.238 2 DEBUG oslo_utils.imageutils.format_inspector [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 21:50:39 compute-0 nova_compute[192716]: 2025-10-07 21:50:39.238 2 DEBUG oslo_concurrency.processutils [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71.part --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:50:39 compute-0 nova_compute[192716]: 2025-10-07 21:50:39.298 2 DEBUG oslo_concurrency.processutils [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71.part --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:50:39 compute-0 nova_compute[192716]: 2025-10-07 21:50:39.299 2 DEBUG nova.virt.images [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] c40cab67-7e52-4762-b275-de0efa24bdf4 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.12/site-packages/nova/virt/images.py:278
Oct 07 21:50:39 compute-0 nova_compute[192716]: 2025-10-07 21:50:39.300 2 DEBUG nova.privsep.utils [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.12/site-packages/nova/privsep/utils.py:63
Oct 07 21:50:39 compute-0 nova_compute[192716]: 2025-10-07 21:50:39.301 2 DEBUG oslo_concurrency.processutils [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71.part /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71.converted execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:50:39 compute-0 nova_compute[192716]: 2025-10-07 21:50:39.716 2 DEBUG oslo_concurrency.processutils [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71.part /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71.converted" returned: 0 in 0.415s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:50:39 compute-0 nova_compute[192716]: 2025-10-07 21:50:39.722 2 DEBUG oslo_concurrency.processutils [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71.converted --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:50:39 compute-0 nova_compute[192716]: 2025-10-07 21:50:39.786 2 DEBUG oslo_concurrency.processutils [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71.converted --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:50:39 compute-0 nova_compute[192716]: 2025-10-07 21:50:39.787 2 DEBUG oslo_concurrency.lockutils [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.248s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:50:39 compute-0 nova_compute[192716]: 2025-10-07 21:50:39.787 2 DEBUG oslo_utils.imageutils.format_inspector [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 21:50:39 compute-0 nova_compute[192716]: 2025-10-07 21:50:39.792 2 DEBUG oslo_utils.imageutils.format_inspector [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 21:50:39 compute-0 nova_compute[192716]: 2025-10-07 21:50:39.794 2 INFO oslo.privsep.daemon [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpu_k5ogvh/privsep.sock']
Oct 07 21:50:40 compute-0 nova_compute[192716]: 2025-10-07 21:50:40.084 2 DEBUG oslo_concurrency.lockutils [req-dc184786-72ee-4899-b827-1b28fdcda4ad req-ee35d45e-1a80-4571-a504-ee13b51d0fa0 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Releasing lock "refresh_cache-992a0107-b0cb-4907-8d2b-bd9d32808982" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 07 21:50:40 compute-0 nova_compute[192716]: 2025-10-07 21:50:40.085 2 DEBUG oslo_concurrency.lockutils [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Acquired lock "refresh_cache-992a0107-b0cb-4907-8d2b-bd9d32808982" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 07 21:50:40 compute-0 nova_compute[192716]: 2025-10-07 21:50:40.085 2 DEBUG nova.network.neutron [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 07 21:50:40 compute-0 nova_compute[192716]: 2025-10-07 21:50:40.536 2 INFO oslo.privsep.daemon [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Spawned new privsep daemon via rootwrap
Oct 07 21:50:40 compute-0 nova_compute[192716]: 2025-10-07 21:50:40.388 65 INFO oslo.privsep.daemon [-] privsep daemon starting
Oct 07 21:50:40 compute-0 nova_compute[192716]: 2025-10-07 21:50:40.392 65 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Oct 07 21:50:40 compute-0 nova_compute[192716]: 2025-10-07 21:50:40.394 65 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Oct 07 21:50:40 compute-0 nova_compute[192716]: 2025-10-07 21:50:40.394 65 INFO oslo.privsep.daemon [-] privsep daemon running as pid 65
Oct 07 21:50:40 compute-0 nova_compute[192716]: 2025-10-07 21:50:40.632 2 DEBUG oslo_concurrency.processutils [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:50:40 compute-0 nova_compute[192716]: 2025-10-07 21:50:40.667 2 DEBUG nova.network.neutron [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 07 21:50:40 compute-0 nova_compute[192716]: 2025-10-07 21:50:40.699 2 DEBUG oslo_concurrency.processutils [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:50:40 compute-0 nova_compute[192716]: 2025-10-07 21:50:40.700 2 DEBUG oslo_concurrency.lockutils [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Acquiring lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:50:40 compute-0 nova_compute[192716]: 2025-10-07 21:50:40.701 2 DEBUG oslo_concurrency.lockutils [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:50:40 compute-0 nova_compute[192716]: 2025-10-07 21:50:40.701 2 DEBUG oslo_utils.imageutils.format_inspector [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 21:50:40 compute-0 nova_compute[192716]: 2025-10-07 21:50:40.709 2 DEBUG oslo_utils.imageutils.format_inspector [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 21:50:40 compute-0 nova_compute[192716]: 2025-10-07 21:50:40.710 2 DEBUG oslo_concurrency.processutils [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:50:40 compute-0 nova_compute[192716]: 2025-10-07 21:50:40.784 2 DEBUG oslo_concurrency.processutils [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:50:40 compute-0 nova_compute[192716]: 2025-10-07 21:50:40.785 2 DEBUG oslo_concurrency.processutils [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71,backing_fmt=raw /var/lib/nova/instances/992a0107-b0cb-4907-8d2b-bd9d32808982/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:50:40 compute-0 nova_compute[192716]: 2025-10-07 21:50:40.826 2 DEBUG oslo_concurrency.processutils [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71,backing_fmt=raw /var/lib/nova/instances/992a0107-b0cb-4907-8d2b-bd9d32808982/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:50:40 compute-0 nova_compute[192716]: 2025-10-07 21:50:40.827 2 DEBUG oslo_concurrency.lockutils [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.126s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:50:40 compute-0 nova_compute[192716]: 2025-10-07 21:50:40.827 2 DEBUG oslo_concurrency.processutils [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:50:40 compute-0 nova_compute[192716]: 2025-10-07 21:50:40.894 2 DEBUG oslo_concurrency.processutils [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:50:40 compute-0 nova_compute[192716]: 2025-10-07 21:50:40.895 2 DEBUG nova.virt.disk.api [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Checking if we can resize image /var/lib/nova/instances/992a0107-b0cb-4907-8d2b-bd9d32808982/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 07 21:50:40 compute-0 nova_compute[192716]: 2025-10-07 21:50:40.895 2 DEBUG oslo_concurrency.processutils [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/992a0107-b0cb-4907-8d2b-bd9d32808982/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:50:40 compute-0 nova_compute[192716]: 2025-10-07 21:50:40.956 2 DEBUG oslo_concurrency.processutils [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/992a0107-b0cb-4907-8d2b-bd9d32808982/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:50:40 compute-0 nova_compute[192716]: 2025-10-07 21:50:40.957 2 DEBUG nova.virt.disk.api [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Cannot resize image /var/lib/nova/instances/992a0107-b0cb-4907-8d2b-bd9d32808982/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 07 21:50:40 compute-0 nova_compute[192716]: 2025-10-07 21:50:40.958 2 DEBUG nova.virt.libvirt.driver [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Oct 07 21:50:40 compute-0 nova_compute[192716]: 2025-10-07 21:50:40.959 2 DEBUG nova.virt.libvirt.driver [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Ensure instance console log exists: /var/lib/nova/instances/992a0107-b0cb-4907-8d2b-bd9d32808982/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Oct 07 21:50:40 compute-0 nova_compute[192716]: 2025-10-07 21:50:40.959 2 DEBUG oslo_concurrency.lockutils [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:50:40 compute-0 nova_compute[192716]: 2025-10-07 21:50:40.959 2 DEBUG oslo_concurrency.lockutils [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:50:40 compute-0 nova_compute[192716]: 2025-10-07 21:50:40.960 2 DEBUG oslo_concurrency.lockutils [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:50:41 compute-0 nova_compute[192716]: 2025-10-07 21:50:41.753 2 WARNING neutronclient.v2_0.client [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:50:42 compute-0 nova_compute[192716]: 2025-10-07 21:50:42.773 2 DEBUG nova.network.neutron [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Updating instance_info_cache with network_info: [{"id": "db056e33-1774-4cfd-86d7-b31a8e48eeec", "address": "fa:16:3e:b9:d3:e7", "network": {"id": "beb3e880-54b4-458c-8efd-ef631c65fa1e", "bridge": "br-int", "label": "tempest-TestDataModel-1675957073-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9eaa040469bd4cefb0eec956f814a225", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb056e33-17", "ovs_interfaceid": "db056e33-1774-4cfd-86d7-b31a8e48eeec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 21:50:43 compute-0 nova_compute[192716]: 2025-10-07 21:50:43.279 2 DEBUG oslo_concurrency.lockutils [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Releasing lock "refresh_cache-992a0107-b0cb-4907-8d2b-bd9d32808982" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 07 21:50:43 compute-0 nova_compute[192716]: 2025-10-07 21:50:43.280 2 DEBUG nova.compute.manager [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Instance network_info: |[{"id": "db056e33-1774-4cfd-86d7-b31a8e48eeec", "address": "fa:16:3e:b9:d3:e7", "network": {"id": "beb3e880-54b4-458c-8efd-ef631c65fa1e", "bridge": "br-int", "label": "tempest-TestDataModel-1675957073-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9eaa040469bd4cefb0eec956f814a225", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb056e33-17", "ovs_interfaceid": "db056e33-1774-4cfd-86d7-b31a8e48eeec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Oct 07 21:50:43 compute-0 nova_compute[192716]: 2025-10-07 21:50:43.282 2 DEBUG nova.virt.libvirt.driver [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Start _get_guest_xml network_info=[{"id": "db056e33-1774-4cfd-86d7-b31a8e48eeec", "address": "fa:16:3e:b9:d3:e7", "network": {"id": "beb3e880-54b4-458c-8efd-ef631c65fa1e", "bridge": "br-int", "label": "tempest-TestDataModel-1675957073-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9eaa040469bd4cefb0eec956f814a225", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb056e33-17", "ovs_interfaceid": "db056e33-1774-4cfd-86d7-b31a8e48eeec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T21:45:40Z,direct_url=<?>,disk_format='qcow2',id=c40cab67-7e52-4762-b275-de0efa24bdf4,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='293ff4341f3d48a4ae100bf4fc7b99bd',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T21:45:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'boot_index': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'image_id': 'c40cab67-7e52-4762-b275-de0efa24bdf4'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Oct 07 21:50:43 compute-0 nova_compute[192716]: 2025-10-07 21:50:43.285 2 WARNING nova.virt.libvirt.driver [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 21:50:43 compute-0 nova_compute[192716]: 2025-10-07 21:50:43.287 2 DEBUG nova.virt.driver [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='c40cab67-7e52-4762-b275-de0efa24bdf4', instance_meta=NovaInstanceMeta(name='tempest-TestDataModel-server-553344788', uuid='992a0107-b0cb-4907-8d2b-bd9d32808982'), owner=OwnerMeta(userid='a94ce99e27c1418a9a1adbaee490bca1', username='tempest-TestDataModel-950169491-project-admin', projectid='d2a134798f4845d3adf6745353aa88f1', projectname='tempest-TestDataModel-950169491'), image=ImageMeta(id='c40cab67-7e52-4762-b275-de0efa24bdf4', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='e6ccb6f6-2ec4-4305-9fdd-4d931e83ed21', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "db056e33-1774-4cfd-86d7-b31a8e48eeec", "address": "fa:16:3e:b9:d3:e7", "network": {"id": "beb3e880-54b4-458c-8efd-ef631c65fa1e", "bridge": "br-int", "label": "tempest-TestDataModel-1675957073-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9eaa040469bd4cefb0eec956f814a225", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb056e33-17", "ovs_interfaceid": "db056e33-1774-4cfd-86d7-b31a8e48eeec", "qbh_params": null, "qbg_params": null, "active": false, 
"vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251007122402.7278e66.el10', creation_time=1759873843.287156) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Oct 07 21:50:43 compute-0 nova_compute[192716]: 2025-10-07 21:50:43.293 2 DEBUG nova.virt.libvirt.host [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Oct 07 21:50:43 compute-0 nova_compute[192716]: 2025-10-07 21:50:43.293 2 DEBUG nova.virt.libvirt.host [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Oct 07 21:50:43 compute-0 nova_compute[192716]: 2025-10-07 21:50:43.295 2 DEBUG nova.virt.libvirt.host [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Oct 07 21:50:43 compute-0 nova_compute[192716]: 2025-10-07 21:50:43.296 2 DEBUG nova.virt.libvirt.host [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Oct 07 21:50:43 compute-0 nova_compute[192716]: 2025-10-07 21:50:43.296 2 DEBUG nova.virt.libvirt.driver [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Oct 07 21:50:43 compute-0 nova_compute[192716]: 2025-10-07 21:50:43.296 2 DEBUG nova.virt.hardware [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-07T21:45:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='e6ccb6f6-2ec4-4305-9fdd-4d931e83ed21',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T21:45:40Z,direct_url=<?>,disk_format='qcow2',id=c40cab67-7e52-4762-b275-de0efa24bdf4,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='293ff4341f3d48a4ae100bf4fc7b99bd',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T21:45:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Oct 07 21:50:43 compute-0 nova_compute[192716]: 2025-10-07 21:50:43.297 2 DEBUG nova.virt.hardware [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Oct 07 21:50:43 compute-0 nova_compute[192716]: 2025-10-07 21:50:43.297 2 DEBUG nova.virt.hardware [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Oct 07 21:50:43 compute-0 nova_compute[192716]: 2025-10-07 21:50:43.297 2 DEBUG nova.virt.hardware [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Oct 07 21:50:43 compute-0 nova_compute[192716]: 2025-10-07 21:50:43.297 2 DEBUG nova.virt.hardware [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Oct 07 21:50:43 compute-0 nova_compute[192716]: 2025-10-07 21:50:43.298 2 DEBUG nova.virt.hardware [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Oct 07 21:50:43 compute-0 nova_compute[192716]: 2025-10-07 21:50:43.298 2 DEBUG nova.virt.hardware [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Oct 07 21:50:43 compute-0 nova_compute[192716]: 2025-10-07 21:50:43.298 2 DEBUG nova.virt.hardware [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Oct 07 21:50:43 compute-0 nova_compute[192716]: 2025-10-07 21:50:43.298 2 DEBUG nova.virt.hardware [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Oct 07 21:50:43 compute-0 nova_compute[192716]: 2025-10-07 21:50:43.298 2 DEBUG nova.virt.hardware [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Oct 07 21:50:43 compute-0 nova_compute[192716]: 2025-10-07 21:50:43.299 2 DEBUG nova.virt.hardware [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Oct 07 21:50:43 compute-0 nova_compute[192716]: 2025-10-07 21:50:43.302 2 DEBUG nova.privsep.utils [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.12/site-packages/nova/privsep/utils.py:63
Oct 07 21:50:43 compute-0 nova_compute[192716]: 2025-10-07 21:50:43.303 2 DEBUG nova.virt.libvirt.vif [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-07T21:50:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestDataModel-server-553344788',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testdatamodel-server-553344788',id=3,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d2a134798f4845d3adf6745353aa88f1',ramdisk_id='',reservation_id='r-kuj8y9y1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestDataModel-950169491',owner_user_name='tempest-TestDataModel-950169491-project-admin'},tags=TagList,task_state='spawning
',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T21:50:36Z,user_data=None,user_id='a94ce99e27c1418a9a1adbaee490bca1',uuid=992a0107-b0cb-4907-8d2b-bd9d32808982,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "db056e33-1774-4cfd-86d7-b31a8e48eeec", "address": "fa:16:3e:b9:d3:e7", "network": {"id": "beb3e880-54b4-458c-8efd-ef631c65fa1e", "bridge": "br-int", "label": "tempest-TestDataModel-1675957073-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9eaa040469bd4cefb0eec956f814a225", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb056e33-17", "ovs_interfaceid": "db056e33-1774-4cfd-86d7-b31a8e48eeec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Oct 07 21:50:43 compute-0 nova_compute[192716]: 2025-10-07 21:50:43.303 2 DEBUG nova.network.os_vif_util [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Converting VIF {"id": "db056e33-1774-4cfd-86d7-b31a8e48eeec", "address": "fa:16:3e:b9:d3:e7", "network": {"id": "beb3e880-54b4-458c-8efd-ef631c65fa1e", "bridge": "br-int", "label": "tempest-TestDataModel-1675957073-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9eaa040469bd4cefb0eec956f814a225", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb056e33-17", "ovs_interfaceid": "db056e33-1774-4cfd-86d7-b31a8e48eeec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 21:50:43 compute-0 nova_compute[192716]: 2025-10-07 21:50:43.304 2 DEBUG nova.network.os_vif_util [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b9:d3:e7,bridge_name='br-int',has_traffic_filtering=True,id=db056e33-1774-4cfd-86d7-b31a8e48eeec,network=Network(beb3e880-54b4-458c-8efd-ef631c65fa1e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb056e33-17') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 21:50:43 compute-0 nova_compute[192716]: 2025-10-07 21:50:43.305 2 DEBUG nova.objects.instance [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Lazy-loading 'pci_devices' on Instance uuid 992a0107-b0cb-4907-8d2b-bd9d32808982 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 21:50:43 compute-0 nova_compute[192716]: 2025-10-07 21:50:43.819 2 DEBUG nova.virt.libvirt.driver [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] End _get_guest_xml xml=<domain type="kvm">
Oct 07 21:50:43 compute-0 nova_compute[192716]:   <uuid>992a0107-b0cb-4907-8d2b-bd9d32808982</uuid>
Oct 07 21:50:43 compute-0 nova_compute[192716]:   <name>instance-00000003</name>
Oct 07 21:50:43 compute-0 nova_compute[192716]:   <memory>131072</memory>
Oct 07 21:50:43 compute-0 nova_compute[192716]:   <vcpu>1</vcpu>
Oct 07 21:50:43 compute-0 nova_compute[192716]:   <metadata>
Oct 07 21:50:43 compute-0 nova_compute[192716]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 07 21:50:43 compute-0 nova_compute[192716]:       <nova:package version="32.1.0-0.20251007122402.7278e66.el10"/>
Oct 07 21:50:43 compute-0 nova_compute[192716]:       <nova:name>tempest-TestDataModel-server-553344788</nova:name>
Oct 07 21:50:43 compute-0 nova_compute[192716]:       <nova:creationTime>2025-10-07 21:50:43</nova:creationTime>
Oct 07 21:50:43 compute-0 nova_compute[192716]:       <nova:flavor name="m1.nano" id="e6ccb6f6-2ec4-4305-9fdd-4d931e83ed21">
Oct 07 21:50:43 compute-0 nova_compute[192716]:         <nova:memory>128</nova:memory>
Oct 07 21:50:43 compute-0 nova_compute[192716]:         <nova:disk>1</nova:disk>
Oct 07 21:50:43 compute-0 nova_compute[192716]:         <nova:swap>0</nova:swap>
Oct 07 21:50:43 compute-0 nova_compute[192716]:         <nova:ephemeral>0</nova:ephemeral>
Oct 07 21:50:43 compute-0 nova_compute[192716]:         <nova:vcpus>1</nova:vcpus>
Oct 07 21:50:43 compute-0 nova_compute[192716]:         <nova:extraSpecs>
Oct 07 21:50:43 compute-0 nova_compute[192716]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 07 21:50:43 compute-0 nova_compute[192716]:         </nova:extraSpecs>
Oct 07 21:50:43 compute-0 nova_compute[192716]:       </nova:flavor>
Oct 07 21:50:43 compute-0 nova_compute[192716]:       <nova:image uuid="c40cab67-7e52-4762-b275-de0efa24bdf4">
Oct 07 21:50:43 compute-0 nova_compute[192716]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 07 21:50:43 compute-0 nova_compute[192716]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 07 21:50:43 compute-0 nova_compute[192716]:         <nova:minDisk>1</nova:minDisk>
Oct 07 21:50:43 compute-0 nova_compute[192716]:         <nova:minRam>0</nova:minRam>
Oct 07 21:50:43 compute-0 nova_compute[192716]:         <nova:properties>
Oct 07 21:50:43 compute-0 nova_compute[192716]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 07 21:50:43 compute-0 nova_compute[192716]:         </nova:properties>
Oct 07 21:50:43 compute-0 nova_compute[192716]:       </nova:image>
Oct 07 21:50:43 compute-0 nova_compute[192716]:       <nova:owner>
Oct 07 21:50:43 compute-0 nova_compute[192716]:         <nova:user uuid="a94ce99e27c1418a9a1adbaee490bca1">tempest-TestDataModel-950169491-project-admin</nova:user>
Oct 07 21:50:43 compute-0 nova_compute[192716]:         <nova:project uuid="d2a134798f4845d3adf6745353aa88f1">tempest-TestDataModel-950169491</nova:project>
Oct 07 21:50:43 compute-0 nova_compute[192716]:       </nova:owner>
Oct 07 21:50:43 compute-0 nova_compute[192716]:       <nova:root type="image" uuid="c40cab67-7e52-4762-b275-de0efa24bdf4"/>
Oct 07 21:50:43 compute-0 nova_compute[192716]:       <nova:ports>
Oct 07 21:50:43 compute-0 nova_compute[192716]:         <nova:port uuid="db056e33-1774-4cfd-86d7-b31a8e48eeec">
Oct 07 21:50:43 compute-0 nova_compute[192716]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Oct 07 21:50:43 compute-0 nova_compute[192716]:         </nova:port>
Oct 07 21:50:43 compute-0 nova_compute[192716]:       </nova:ports>
Oct 07 21:50:43 compute-0 nova_compute[192716]:     </nova:instance>
Oct 07 21:50:43 compute-0 nova_compute[192716]:   </metadata>
Oct 07 21:50:43 compute-0 nova_compute[192716]:   <sysinfo type="smbios">
Oct 07 21:50:43 compute-0 nova_compute[192716]:     <system>
Oct 07 21:50:43 compute-0 nova_compute[192716]:       <entry name="manufacturer">RDO</entry>
Oct 07 21:50:43 compute-0 nova_compute[192716]:       <entry name="product">OpenStack Compute</entry>
Oct 07 21:50:43 compute-0 nova_compute[192716]:       <entry name="version">32.1.0-0.20251007122402.7278e66.el10</entry>
Oct 07 21:50:43 compute-0 nova_compute[192716]:       <entry name="serial">992a0107-b0cb-4907-8d2b-bd9d32808982</entry>
Oct 07 21:50:43 compute-0 nova_compute[192716]:       <entry name="uuid">992a0107-b0cb-4907-8d2b-bd9d32808982</entry>
Oct 07 21:50:43 compute-0 nova_compute[192716]:       <entry name="family">Virtual Machine</entry>
Oct 07 21:50:43 compute-0 nova_compute[192716]:     </system>
Oct 07 21:50:43 compute-0 nova_compute[192716]:   </sysinfo>
Oct 07 21:50:43 compute-0 nova_compute[192716]:   <os>
Oct 07 21:50:43 compute-0 nova_compute[192716]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 07 21:50:43 compute-0 nova_compute[192716]:     <boot dev="hd"/>
Oct 07 21:50:43 compute-0 nova_compute[192716]:     <smbios mode="sysinfo"/>
Oct 07 21:50:43 compute-0 nova_compute[192716]:   </os>
Oct 07 21:50:43 compute-0 nova_compute[192716]:   <features>
Oct 07 21:50:43 compute-0 nova_compute[192716]:     <acpi/>
Oct 07 21:50:43 compute-0 nova_compute[192716]:     <apic/>
Oct 07 21:50:43 compute-0 nova_compute[192716]:     <vmcoreinfo/>
Oct 07 21:50:43 compute-0 nova_compute[192716]:   </features>
Oct 07 21:50:43 compute-0 nova_compute[192716]:   <clock offset="utc">
Oct 07 21:50:43 compute-0 nova_compute[192716]:     <timer name="pit" tickpolicy="delay"/>
Oct 07 21:50:43 compute-0 nova_compute[192716]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 07 21:50:43 compute-0 nova_compute[192716]:     <timer name="hpet" present="no"/>
Oct 07 21:50:43 compute-0 nova_compute[192716]:   </clock>
Oct 07 21:50:43 compute-0 nova_compute[192716]:   <cpu mode="host-model" match="exact">
Oct 07 21:50:43 compute-0 nova_compute[192716]:     <topology sockets="1" cores="1" threads="1"/>
Oct 07 21:50:43 compute-0 nova_compute[192716]:   </cpu>
Oct 07 21:50:43 compute-0 nova_compute[192716]:   <devices>
Oct 07 21:50:43 compute-0 nova_compute[192716]:     <disk type="file" device="disk">
Oct 07 21:50:43 compute-0 nova_compute[192716]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 07 21:50:43 compute-0 nova_compute[192716]:       <source file="/var/lib/nova/instances/992a0107-b0cb-4907-8d2b-bd9d32808982/disk"/>
Oct 07 21:50:43 compute-0 nova_compute[192716]:       <target dev="vda" bus="virtio"/>
Oct 07 21:50:43 compute-0 nova_compute[192716]:     </disk>
Oct 07 21:50:43 compute-0 nova_compute[192716]:     <disk type="file" device="cdrom">
Oct 07 21:50:43 compute-0 nova_compute[192716]:       <driver name="qemu" type="raw" cache="none"/>
Oct 07 21:50:43 compute-0 nova_compute[192716]:       <source file="/var/lib/nova/instances/992a0107-b0cb-4907-8d2b-bd9d32808982/disk.config"/>
Oct 07 21:50:43 compute-0 nova_compute[192716]:       <target dev="sda" bus="sata"/>
Oct 07 21:50:43 compute-0 nova_compute[192716]:     </disk>
Oct 07 21:50:43 compute-0 nova_compute[192716]:     <interface type="ethernet">
Oct 07 21:50:43 compute-0 nova_compute[192716]:       <mac address="fa:16:3e:b9:d3:e7"/>
Oct 07 21:50:43 compute-0 nova_compute[192716]:       <model type="virtio"/>
Oct 07 21:50:43 compute-0 nova_compute[192716]:       <driver name="vhost" rx_queue_size="512"/>
Oct 07 21:50:43 compute-0 nova_compute[192716]:       <mtu size="1442"/>
Oct 07 21:50:43 compute-0 nova_compute[192716]:       <target dev="tapdb056e33-17"/>
Oct 07 21:50:43 compute-0 nova_compute[192716]:     </interface>
Oct 07 21:50:43 compute-0 nova_compute[192716]:     <serial type="pty">
Oct 07 21:50:43 compute-0 nova_compute[192716]:       <log file="/var/lib/nova/instances/992a0107-b0cb-4907-8d2b-bd9d32808982/console.log" append="off"/>
Oct 07 21:50:43 compute-0 nova_compute[192716]:     </serial>
Oct 07 21:50:43 compute-0 nova_compute[192716]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 07 21:50:43 compute-0 nova_compute[192716]:     <video>
Oct 07 21:50:43 compute-0 nova_compute[192716]:       <model type="virtio"/>
Oct 07 21:50:43 compute-0 nova_compute[192716]:     </video>
Oct 07 21:50:43 compute-0 nova_compute[192716]:     <input type="tablet" bus="usb"/>
Oct 07 21:50:43 compute-0 nova_compute[192716]:     <rng model="virtio">
Oct 07 21:50:43 compute-0 nova_compute[192716]:       <backend model="random">/dev/urandom</backend>
Oct 07 21:50:43 compute-0 nova_compute[192716]:     </rng>
Oct 07 21:50:43 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root"/>
Oct 07 21:50:43 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:50:43 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:50:43 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:50:43 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:50:43 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:50:43 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:50:43 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:50:43 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:50:43 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:50:43 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:50:43 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:50:43 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:50:43 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:50:43 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:50:43 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:50:43 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:50:43 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:50:43 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:50:43 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:50:43 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:50:43 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:50:43 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:50:43 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:50:43 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:50:43 compute-0 nova_compute[192716]:     <controller type="usb" index="0"/>
Oct 07 21:50:43 compute-0 nova_compute[192716]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 07 21:50:43 compute-0 nova_compute[192716]:       <stats period="10"/>
Oct 07 21:50:43 compute-0 nova_compute[192716]:     </memballoon>
Oct 07 21:50:43 compute-0 nova_compute[192716]:   </devices>
Oct 07 21:50:43 compute-0 nova_compute[192716]: </domain>
Oct 07 21:50:43 compute-0 nova_compute[192716]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Oct 07 21:50:43 compute-0 nova_compute[192716]: 2025-10-07 21:50:43.820 2 DEBUG nova.compute.manager [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Preparing to wait for external event network-vif-plugged-db056e33-1774-4cfd-86d7-b31a8e48eeec prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Oct 07 21:50:43 compute-0 nova_compute[192716]: 2025-10-07 21:50:43.820 2 DEBUG oslo_concurrency.lockutils [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Acquiring lock "992a0107-b0cb-4907-8d2b-bd9d32808982-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:50:43 compute-0 nova_compute[192716]: 2025-10-07 21:50:43.820 2 DEBUG oslo_concurrency.lockutils [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Lock "992a0107-b0cb-4907-8d2b-bd9d32808982-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:50:43 compute-0 nova_compute[192716]: 2025-10-07 21:50:43.821 2 DEBUG oslo_concurrency.lockutils [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Lock "992a0107-b0cb-4907-8d2b-bd9d32808982-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:50:43 compute-0 nova_compute[192716]: 2025-10-07 21:50:43.822 2 DEBUG nova.virt.libvirt.vif [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-07T21:50:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestDataModel-server-553344788',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testdatamodel-server-553344788',id=3,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d2a134798f4845d3adf6745353aa88f1',ramdisk_id='',reservation_id='r-kuj8y9y1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestDataModel-950169491',owner_user_name='tempest-TestDataModel-950169491-project-admin'},tags=TagList,task_state
='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T21:50:36Z,user_data=None,user_id='a94ce99e27c1418a9a1adbaee490bca1',uuid=992a0107-b0cb-4907-8d2b-bd9d32808982,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "db056e33-1774-4cfd-86d7-b31a8e48eeec", "address": "fa:16:3e:b9:d3:e7", "network": {"id": "beb3e880-54b4-458c-8efd-ef631c65fa1e", "bridge": "br-int", "label": "tempest-TestDataModel-1675957073-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9eaa040469bd4cefb0eec956f814a225", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb056e33-17", "ovs_interfaceid": "db056e33-1774-4cfd-86d7-b31a8e48eeec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 07 21:50:43 compute-0 nova_compute[192716]: 2025-10-07 21:50:43.822 2 DEBUG nova.network.os_vif_util [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Converting VIF {"id": "db056e33-1774-4cfd-86d7-b31a8e48eeec", "address": "fa:16:3e:b9:d3:e7", "network": {"id": "beb3e880-54b4-458c-8efd-ef631c65fa1e", "bridge": "br-int", "label": "tempest-TestDataModel-1675957073-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9eaa040469bd4cefb0eec956f814a225", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb056e33-17", "ovs_interfaceid": "db056e33-1774-4cfd-86d7-b31a8e48eeec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 21:50:43 compute-0 nova_compute[192716]: 2025-10-07 21:50:43.823 2 DEBUG nova.network.os_vif_util [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b9:d3:e7,bridge_name='br-int',has_traffic_filtering=True,id=db056e33-1774-4cfd-86d7-b31a8e48eeec,network=Network(beb3e880-54b4-458c-8efd-ef631c65fa1e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb056e33-17') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 21:50:43 compute-0 nova_compute[192716]: 2025-10-07 21:50:43.824 2 DEBUG os_vif [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b9:d3:e7,bridge_name='br-int',has_traffic_filtering=True,id=db056e33-1774-4cfd-86d7-b31a8e48eeec,network=Network(beb3e880-54b4-458c-8efd-ef631c65fa1e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb056e33-17') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 07 21:50:43 compute-0 podman[215633]: 2025-10-07 21:50:43.877910348 +0000 UTC m=+0.110754425 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251007, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Oct 07 21:50:43 compute-0 nova_compute[192716]: 2025-10-07 21:50:43.879 2 DEBUG ovsdbapp.backend.ovs_idl [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 07 21:50:43 compute-0 nova_compute[192716]: 2025-10-07 21:50:43.880 2 DEBUG ovsdbapp.backend.ovs_idl [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 07 21:50:43 compute-0 nova_compute[192716]: 2025-10-07 21:50:43.880 2 DEBUG ovsdbapp.backend.ovs_idl [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 07 21:50:43 compute-0 nova_compute[192716]: 2025-10-07 21:50:43.880 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Oct 07 21:50:43 compute-0 nova_compute[192716]: 2025-10-07 21:50:43.881 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] [POLLOUT] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:50:43 compute-0 nova_compute[192716]: 2025-10-07 21:50:43.881 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Oct 07 21:50:43 compute-0 nova_compute[192716]: 2025-10-07 21:50:43.882 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:50:43 compute-0 nova_compute[192716]: 2025-10-07 21:50:43.883 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:50:43 compute-0 nova_compute[192716]: 2025-10-07 21:50:43.886 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:50:43 compute-0 nova_compute[192716]: 2025-10-07 21:50:43.893 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:50:43 compute-0 nova_compute[192716]: 2025-10-07 21:50:43.893 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:50:43 compute-0 nova_compute[192716]: 2025-10-07 21:50:43.893 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 21:50:43 compute-0 nova_compute[192716]: 2025-10-07 21:50:43.894 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:50:43 compute-0 nova_compute[192716]: 2025-10-07 21:50:43.894 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '4864f82c-5d95-55a0-9be5-54206812bfbb', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:50:43 compute-0 nova_compute[192716]: 2025-10-07 21:50:43.895 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:50:43 compute-0 nova_compute[192716]: 2025-10-07 21:50:43.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:50:43 compute-0 nova_compute[192716]: 2025-10-07 21:50:43.898 2 INFO oslo.privsep.daemon [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmp4bkwdn3h/privsep.sock']
Oct 07 21:50:44 compute-0 nova_compute[192716]: 2025-10-07 21:50:44.050 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:50:44 compute-0 nova_compute[192716]: 2025-10-07 21:50:44.727 2 INFO oslo.privsep.daemon [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Spawned new privsep daemon via rootwrap
Oct 07 21:50:44 compute-0 nova_compute[192716]: 2025-10-07 21:50:44.546 86 INFO oslo.privsep.daemon [-] privsep daemon starting
Oct 07 21:50:44 compute-0 nova_compute[192716]: 2025-10-07 21:50:44.552 86 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Oct 07 21:50:44 compute-0 nova_compute[192716]: 2025-10-07 21:50:44.557 86 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Oct 07 21:50:44 compute-0 nova_compute[192716]: 2025-10-07 21:50:44.557 86 INFO oslo.privsep.daemon [-] privsep daemon running as pid 86
Oct 07 21:50:44 compute-0 nova_compute[192716]: 2025-10-07 21:50:44.970 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:50:44 compute-0 nova_compute[192716]: 2025-10-07 21:50:44.971 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdb056e33-17, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:50:44 compute-0 nova_compute[192716]: 2025-10-07 21:50:44.972 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapdb056e33-17, col_values=(('qos', UUID('e70d5918-a7e8-4f3b-b5bc-44d81f2e7862')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:50:44 compute-0 nova_compute[192716]: 2025-10-07 21:50:44.973 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapdb056e33-17, col_values=(('external_ids', {'iface-id': 'db056e33-1774-4cfd-86d7-b31a8e48eeec', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b9:d3:e7', 'vm-uuid': '992a0107-b0cb-4907-8d2b-bd9d32808982'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:50:45 compute-0 nova_compute[192716]: 2025-10-07 21:50:45.007 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:50:45 compute-0 NetworkManager[51722]: <info>  [1759873845.0085] manager: (tapdb056e33-17): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/22)
Oct 07 21:50:45 compute-0 nova_compute[192716]: 2025-10-07 21:50:45.010 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 07 21:50:45 compute-0 nova_compute[192716]: 2025-10-07 21:50:45.016 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:50:45 compute-0 nova_compute[192716]: 2025-10-07 21:50:45.016 2 INFO os_vif [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b9:d3:e7,bridge_name='br-int',has_traffic_filtering=True,id=db056e33-1774-4cfd-86d7-b31a8e48eeec,network=Network(beb3e880-54b4-458c-8efd-ef631c65fa1e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb056e33-17')
Oct 07 21:50:46 compute-0 nova_compute[192716]: 2025-10-07 21:50:46.557 2 DEBUG nova.virt.libvirt.driver [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 07 21:50:46 compute-0 nova_compute[192716]: 2025-10-07 21:50:46.557 2 DEBUG nova.virt.libvirt.driver [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 07 21:50:46 compute-0 nova_compute[192716]: 2025-10-07 21:50:46.558 2 DEBUG nova.virt.libvirt.driver [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] No VIF found with MAC fa:16:3e:b9:d3:e7, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Oct 07 21:50:46 compute-0 nova_compute[192716]: 2025-10-07 21:50:46.559 2 INFO nova.virt.libvirt.driver [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Using config drive
Oct 07 21:50:46 compute-0 podman[215669]: 2025-10-07 21:50:46.833855446 +0000 UTC m=+0.070449043 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251007, tcib_managed=true)
Oct 07 21:50:47 compute-0 nova_compute[192716]: 2025-10-07 21:50:47.073 2 WARNING neutronclient.v2_0.client [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:50:47 compute-0 nova_compute[192716]: 2025-10-07 21:50:47.669 2 INFO nova.virt.libvirt.driver [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Creating config drive at /var/lib/nova/instances/992a0107-b0cb-4907-8d2b-bd9d32808982/disk.config
Oct 07 21:50:47 compute-0 nova_compute[192716]: 2025-10-07 21:50:47.677 2 DEBUG oslo_concurrency.processutils [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/992a0107-b0cb-4907-8d2b-bd9d32808982/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251007122402.7278e66.el10 -quiet -J -r -V config-2 /tmp/tmp3yq5wodl execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:50:47 compute-0 nova_compute[192716]: 2025-10-07 21:50:47.802 2 DEBUG oslo_concurrency.processutils [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/992a0107-b0cb-4907-8d2b-bd9d32808982/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251007122402.7278e66.el10 -quiet -J -r -V config-2 /tmp/tmp3yq5wodl" returned: 0 in 0.126s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:50:47 compute-0 kernel: tun: Universal TUN/TAP device driver, 1.6
Oct 07 21:50:47 compute-0 kernel: tapdb056e33-17: entered promiscuous mode
Oct 07 21:50:47 compute-0 NetworkManager[51722]: <info>  [1759873847.8949] manager: (tapdb056e33-17): new Tun device (/org/freedesktop/NetworkManager/Devices/23)
Oct 07 21:50:47 compute-0 ovn_controller[94904]: 2025-10-07T21:50:47Z|00040|binding|INFO|Claiming lport db056e33-1774-4cfd-86d7-b31a8e48eeec for this chassis.
Oct 07 21:50:47 compute-0 ovn_controller[94904]: 2025-10-07T21:50:47Z|00041|binding|INFO|db056e33-1774-4cfd-86d7-b31a8e48eeec: Claiming fa:16:3e:b9:d3:e7 10.100.0.3
Oct 07 21:50:47 compute-0 nova_compute[192716]: 2025-10-07 21:50:47.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:50:47 compute-0 nova_compute[192716]: 2025-10-07 21:50:47.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:50:47 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:47.920 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b9:d3:e7 10.100.0.3'], port_security=['fa:16:3e:b9:d3:e7 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '992a0107-b0cb-4907-8d2b-bd9d32808982', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-beb3e880-54b4-458c-8efd-ef631c65fa1e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd2a134798f4845d3adf6745353aa88f1', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5dcc3416-b47f-443e-889e-b5db7d51b5c7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e95305b8-51d0-4753-9117-c1612bb00fbf, chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], logical_port=db056e33-1774-4cfd-86d7-b31a8e48eeec) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 21:50:47 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:47.922 103791 INFO neutron.agent.ovn.metadata.agent [-] Port db056e33-1774-4cfd-86d7-b31a8e48eeec in datapath beb3e880-54b4-458c-8efd-ef631c65fa1e bound to our chassis
Oct 07 21:50:47 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:47.923 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network beb3e880-54b4-458c-8efd-ef631c65fa1e
Oct 07 21:50:47 compute-0 systemd-udevd[215710]: Network interface NamePolicy= disabled on kernel command line.
Oct 07 21:50:47 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:47.959 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[a64f6fce-062d-476a-93c3-e5ed6f5b8c9e]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:50:47 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:47.960 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapbeb3e880-51 in ovnmeta-beb3e880-54b4-458c-8efd-ef631c65fa1e namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Oct 07 21:50:47 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:47.965 214116 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapbeb3e880-50 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Oct 07 21:50:47 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:47.965 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[8aeae262-5e1c-48a3-80ea-2e545fcd7a1d]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:50:47 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:47.967 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[d6893a5b-cc4f-4424-874a-73866edca977]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:50:47 compute-0 NetworkManager[51722]: <info>  [1759873847.9710] device (tapdb056e33-17): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 07 21:50:47 compute-0 NetworkManager[51722]: <info>  [1759873847.9729] device (tapdb056e33-17): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 07 21:50:47 compute-0 systemd-machined[152719]: New machine qemu-1-instance-00000003.
Oct 07 21:50:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:47.997 103905 DEBUG oslo.privsep.daemon [-] privsep: reply[8628e9b2-65e4-47ec-b3e6-ceab8aa35b4d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:50:48 compute-0 nova_compute[192716]: 2025-10-07 21:50:47.998 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:50:48 compute-0 systemd[1]: Started Virtual Machine qemu-1-instance-00000003.
Oct 07 21:50:48 compute-0 ovn_controller[94904]: 2025-10-07T21:50:48Z|00042|binding|INFO|Setting lport db056e33-1774-4cfd-86d7-b31a8e48eeec ovn-installed in OVS
Oct 07 21:50:48 compute-0 ovn_controller[94904]: 2025-10-07T21:50:48Z|00043|binding|INFO|Setting lport db056e33-1774-4cfd-86d7-b31a8e48eeec up in Southbound
Oct 07 21:50:48 compute-0 nova_compute[192716]: 2025-10-07 21:50:48.006 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:50:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:48.006 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[2714d4ba-cf39-4049-99a9-6cf36b7415e4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:50:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:48.007 103791 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpetpkbcva/privsep.sock']
Oct 07 21:50:48 compute-0 nova_compute[192716]: 2025-10-07 21:50:48.189 2 DEBUG nova.compute.manager [req-ce8f0547-4f56-4fac-a949-b0c5508157bf req-eaf77e26-c973-416e-9f9c-59b145063f37 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Received event network-vif-plugged-db056e33-1774-4cfd-86d7-b31a8e48eeec external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 21:50:48 compute-0 nova_compute[192716]: 2025-10-07 21:50:48.190 2 DEBUG oslo_concurrency.lockutils [req-ce8f0547-4f56-4fac-a949-b0c5508157bf req-eaf77e26-c973-416e-9f9c-59b145063f37 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "992a0107-b0cb-4907-8d2b-bd9d32808982-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:50:48 compute-0 nova_compute[192716]: 2025-10-07 21:50:48.190 2 DEBUG oslo_concurrency.lockutils [req-ce8f0547-4f56-4fac-a949-b0c5508157bf req-eaf77e26-c973-416e-9f9c-59b145063f37 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "992a0107-b0cb-4907-8d2b-bd9d32808982-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:50:48 compute-0 nova_compute[192716]: 2025-10-07 21:50:48.191 2 DEBUG oslo_concurrency.lockutils [req-ce8f0547-4f56-4fac-a949-b0c5508157bf req-eaf77e26-c973-416e-9f9c-59b145063f37 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "992a0107-b0cb-4907-8d2b-bd9d32808982-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:50:48 compute-0 nova_compute[192716]: 2025-10-07 21:50:48.192 2 DEBUG nova.compute.manager [req-ce8f0547-4f56-4fac-a949-b0c5508157bf req-eaf77e26-c973-416e-9f9c-59b145063f37 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Processing event network-vif-plugged-db056e33-1774-4cfd-86d7-b31a8e48eeec _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Oct 07 21:50:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:48.677 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:e1:cc', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '62:84:ba:a4:1b:31'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 21:50:48 compute-0 nova_compute[192716]: 2025-10-07 21:50:48.678 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:50:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:48.761 103791 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Oct 07 21:50:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:48.761 103791 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpetpkbcva/privsep.sock __init__ /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:377
Oct 07 21:50:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:48.618 215740 INFO oslo.privsep.daemon [-] privsep daemon starting
Oct 07 21:50:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:48.621 215740 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Oct 07 21:50:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:48.623 215740 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Oct 07 21:50:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:48.623 215740 INFO oslo.privsep.daemon [-] privsep daemon running as pid 215740
Oct 07 21:50:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:48.763 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[1cd098a4-83f6-4533-ad21-3b9f965fe39f]: (2,) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:50:48 compute-0 nova_compute[192716]: 2025-10-07 21:50:48.808 2 DEBUG nova.compute.manager [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Oct 07 21:50:48 compute-0 nova_compute[192716]: 2025-10-07 21:50:48.820 2 DEBUG nova.virt.libvirt.driver [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Oct 07 21:50:48 compute-0 nova_compute[192716]: 2025-10-07 21:50:48.824 2 INFO nova.virt.libvirt.driver [-] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Instance spawned successfully.
Oct 07 21:50:48 compute-0 nova_compute[192716]: 2025-10-07 21:50:48.824 2 DEBUG nova.virt.libvirt.driver [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Oct 07 21:50:49 compute-0 nova_compute[192716]: 2025-10-07 21:50:49.051 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:49.206 215740 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:49.206 215740 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:49.206 215740 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:50:49 compute-0 nova_compute[192716]: 2025-10-07 21:50:49.338 2 DEBUG nova.virt.libvirt.driver [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 21:50:49 compute-0 nova_compute[192716]: 2025-10-07 21:50:49.339 2 DEBUG nova.virt.libvirt.driver [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 21:50:49 compute-0 nova_compute[192716]: 2025-10-07 21:50:49.339 2 DEBUG nova.virt.libvirt.driver [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 21:50:49 compute-0 nova_compute[192716]: 2025-10-07 21:50:49.340 2 DEBUG nova.virt.libvirt.driver [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 21:50:49 compute-0 nova_compute[192716]: 2025-10-07 21:50:49.341 2 DEBUG nova.virt.libvirt.driver [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 21:50:49 compute-0 nova_compute[192716]: 2025-10-07 21:50:49.341 2 DEBUG nova.virt.libvirt.driver [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:49.628 215740 INFO oslo_service.backend [-] Loading backend: eventlet
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:49.633 215740 INFO oslo_service.backend [-] Backend 'eventlet' successfully loaded and cached.
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:49.702 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[77d6b578-c281-4349-8760-e26389360e21]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:49.708 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[7f74c3fa-26d5-4847-b159-458c8b8ce39d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:50:49 compute-0 NetworkManager[51722]: <info>  [1759873849.7093] manager: (tapbeb3e880-50): new Veth device (/org/freedesktop/NetworkManager/Devices/24)
Oct 07 21:50:49 compute-0 systemd-udevd[215713]: Network interface NamePolicy= disabled on kernel command line.
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:49.744 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[9ee0ecb0-deb9-46e1-ba95-529d8f779759]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:49.747 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[2dc55a83-1608-43d2-a82d-b73feb97b502]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:50:49 compute-0 NetworkManager[51722]: <info>  [1759873849.7704] device (tapbeb3e880-50): carrier: link connected
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:49.776 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[4b2faf44-d6d3-4184-9a8b-0599a6fb4cdc]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:49.796 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[87a6fdab-7d4c-41af-a46b-d9b2bd7ee396]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbeb3e880-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7d:06:a7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 358335, 'reachable_time': 27732, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215766, 'error': None, 'target': 'ovnmeta-beb3e880-54b4-458c-8efd-ef631c65fa1e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:49.813 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[e7ee01b6-9ccc-457c-a74e-ebd696406cf3]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7d:6a7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 358335, 'tstamp': 358335}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215767, 'error': None, 'target': 'ovnmeta-beb3e880-54b4-458c-8efd-ef631c65fa1e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:49.832 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[63977425-0893-443c-9244-4bfb2611cbc0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbeb3e880-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7d:06:a7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 358335, 'reachable_time': 27732, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215768, 'error': None, 'target': 'ovnmeta-beb3e880-54b4-458c-8efd-ef631c65fa1e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:50:49 compute-0 nova_compute[192716]: 2025-10-07 21:50:49.853 2 INFO nova.compute.manager [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Took 12.32 seconds to spawn the instance on the hypervisor.
Oct 07 21:50:49 compute-0 nova_compute[192716]: 2025-10-07 21:50:49.856 2 DEBUG nova.compute.manager [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:49.870 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[5e1507c4-7c4f-402f-84dd-bf2aeab3920f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:49.941 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[0946a270-1ee4-4e44-aa91-d01f59e01cf3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:49.942 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbeb3e880-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:49.942 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:49.943 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbeb3e880-50, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:50:49 compute-0 kernel: tapbeb3e880-50: entered promiscuous mode
Oct 07 21:50:49 compute-0 NetworkManager[51722]: <info>  [1759873849.9469] manager: (tapbeb3e880-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/25)
Oct 07 21:50:49 compute-0 nova_compute[192716]: 2025-10-07 21:50:49.946 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:49.949 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapbeb3e880-50, col_values=(('external_ids', {'iface-id': '8f5a7227-c809-4a74-8f29-90ca5d40ed87'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:50:49 compute-0 nova_compute[192716]: 2025-10-07 21:50:49.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:50:49 compute-0 ovn_controller[94904]: 2025-10-07T21:50:49Z|00044|binding|INFO|Releasing lport 8f5a7227-c809-4a74-8f29-90ca5d40ed87 from this chassis (sb_readonly=0)
Oct 07 21:50:49 compute-0 nova_compute[192716]: 2025-10-07 21:50:49.960 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:49.962 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[c4a6429f-377b-48c1-a70c-c21624c9324b]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:49.962 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/beb3e880-54b4-458c-8efd-ef631c65fa1e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/beb3e880-54b4-458c-8efd-ef631c65fa1e.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:49.963 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/beb3e880-54b4-458c-8efd-ef631c65fa1e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/beb3e880-54b4-458c-8efd-ef631c65fa1e.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:49.963 103791 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for beb3e880-54b4-458c-8efd-ef631c65fa1e disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:49.963 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/beb3e880-54b4-458c-8efd-ef631c65fa1e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/beb3e880-54b4-458c-8efd-ef631c65fa1e.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:49.963 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[0a9b94dd-4781-4c2a-874c-8e192eca55d3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:49.964 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/beb3e880-54b4-458c-8efd-ef631c65fa1e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/beb3e880-54b4-458c-8efd-ef631c65fa1e.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:49.964 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[1cf67a0f-9f11-4ad6-a899-f6d3b8cf442d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:49.964 103791 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]: global
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]:     log         /dev/log local0 debug
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]:     log-tag     haproxy-metadata-proxy-beb3e880-54b4-458c-8efd-ef631c65fa1e
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]:     user        root
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]:     group       root
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]:     maxconn     1024
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]:     pidfile     /var/lib/neutron/external/pids/beb3e880-54b4-458c-8efd-ef631c65fa1e.pid.haproxy
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]:     daemon
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]: 
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]: defaults
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]:     log global
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]:     mode http
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]:     option httplog
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]:     option dontlognull
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]:     option http-server-close
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]:     option forwardfor
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]:     retries                 3
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]:     timeout http-request    30s
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]:     timeout connect         30s
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]:     timeout client          32s
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]:     timeout server          32s
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]:     timeout http-keep-alive 30s
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]: 
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]: listen listener
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]:     bind 169.254.169.254:80
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]:     
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]:     server metadata /var/lib/neutron/metadata_proxy
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]: 
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]:     http-request add-header X-OVN-Network-ID beb3e880-54b4-458c-8efd-ef631c65fa1e
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Oct 07 21:50:49 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:49.965 103791 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-beb3e880-54b4-458c-8efd-ef631c65fa1e', 'env', 'PROCESS_TAG=haproxy-beb3e880-54b4-458c-8efd-ef631c65fa1e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/beb3e880-54b4-458c-8efd-ef631c65fa1e.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Oct 07 21:50:50 compute-0 nova_compute[192716]: 2025-10-07 21:50:50.007 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:50:50 compute-0 nova_compute[192716]: 2025-10-07 21:50:50.261 2 DEBUG nova.compute.manager [req-568831d6-edc6-44e6-9666-778c596e305b req-eff70c7b-4fa0-447f-84ed-3d0ab48ad3cb 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Received event network-vif-plugged-db056e33-1774-4cfd-86d7-b31a8e48eeec external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 21:50:50 compute-0 nova_compute[192716]: 2025-10-07 21:50:50.262 2 DEBUG oslo_concurrency.lockutils [req-568831d6-edc6-44e6-9666-778c596e305b req-eff70c7b-4fa0-447f-84ed-3d0ab48ad3cb 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "992a0107-b0cb-4907-8d2b-bd9d32808982-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:50:50 compute-0 nova_compute[192716]: 2025-10-07 21:50:50.262 2 DEBUG oslo_concurrency.lockutils [req-568831d6-edc6-44e6-9666-778c596e305b req-eff70c7b-4fa0-447f-84ed-3d0ab48ad3cb 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "992a0107-b0cb-4907-8d2b-bd9d32808982-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:50:50 compute-0 nova_compute[192716]: 2025-10-07 21:50:50.262 2 DEBUG oslo_concurrency.lockutils [req-568831d6-edc6-44e6-9666-778c596e305b req-eff70c7b-4fa0-447f-84ed-3d0ab48ad3cb 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "992a0107-b0cb-4907-8d2b-bd9d32808982-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:50:50 compute-0 nova_compute[192716]: 2025-10-07 21:50:50.263 2 DEBUG nova.compute.manager [req-568831d6-edc6-44e6-9666-778c596e305b req-eff70c7b-4fa0-447f-84ed-3d0ab48ad3cb 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] No waiting events found dispatching network-vif-plugged-db056e33-1774-4cfd-86d7-b31a8e48eeec pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 21:50:50 compute-0 nova_compute[192716]: 2025-10-07 21:50:50.263 2 WARNING nova.compute.manager [req-568831d6-edc6-44e6-9666-778c596e305b req-eff70c7b-4fa0-447f-84ed-3d0ab48ad3cb 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Received unexpected event network-vif-plugged-db056e33-1774-4cfd-86d7-b31a8e48eeec for instance with vm_state active and task_state None.
Oct 07 21:50:50 compute-0 podman[215801]: 2025-10-07 21:50:50.370478773 +0000 UTC m=+0.053891122 container create f5dd49c71968bafe14c18bf9fd4b91c39f7e61f93696c7592dc1d39f411eccad (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-beb3e880-54b4-458c-8efd-ef631c65fa1e, org.label-schema.build-date=20251007, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4)
Oct 07 21:50:50 compute-0 nova_compute[192716]: 2025-10-07 21:50:50.389 2 INFO nova.compute.manager [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Took 17.64 seconds to build instance.
Oct 07 21:50:50 compute-0 systemd[1]: Started libpod-conmon-f5dd49c71968bafe14c18bf9fd4b91c39f7e61f93696c7592dc1d39f411eccad.scope.
Oct 07 21:50:50 compute-0 podman[215801]: 2025-10-07 21:50:50.341692021 +0000 UTC m=+0.025104410 image pull 24d4277b41bbd1d97b6f360ea068040fe96182680512bacad34d1f578f4798a9 38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Oct 07 21:50:50 compute-0 systemd[1]: Started libcrun container.
Oct 07 21:50:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2b9e4e3327e6edaa6c05e69d316ab2ed269bb311fb1c4c5de24b8278985faa84/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 07 21:50:50 compute-0 podman[215801]: 2025-10-07 21:50:50.463212175 +0000 UTC m=+0.146624594 container init f5dd49c71968bafe14c18bf9fd4b91c39f7e61f93696c7592dc1d39f411eccad (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-beb3e880-54b4-458c-8efd-ef631c65fa1e, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251007, maintainer=OpenStack Kubernetes Operator team)
Oct 07 21:50:50 compute-0 podman[215801]: 2025-10-07 21:50:50.471083754 +0000 UTC m=+0.154496133 container start f5dd49c71968bafe14c18bf9fd4b91c39f7e61f93696c7592dc1d39f411eccad (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-beb3e880-54b4-458c-8efd-ef631c65fa1e, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251007, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct 07 21:50:50 compute-0 neutron-haproxy-ovnmeta-beb3e880-54b4-458c-8efd-ef631c65fa1e[215817]: [NOTICE]   (215821) : New worker (215823) forked
Oct 07 21:50:50 compute-0 neutron-haproxy-ovnmeta-beb3e880-54b4-458c-8efd-ef631c65fa1e[215817]: [NOTICE]   (215821) : Loading success.
Oct 07 21:50:50 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:50.643 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 07 21:50:50 compute-0 nova_compute[192716]: 2025-10-07 21:50:50.894 2 DEBUG oslo_concurrency.lockutils [None req-7b5d63fc-b716-4a3d-a440-bad2954f4a40 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Lock "992a0107-b0cb-4907-8d2b-bd9d32808982" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.160s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:50:51 compute-0 podman[215833]: 2025-10-07 21:50:51.867632839 +0000 UTC m=+0.095878361 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, architecture=x86_64, config_id=edpm, io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-type=git, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, release=1755695350, build-date=2025-08-20T13:12:41, distribution-scope=public, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct 07 21:50:54 compute-0 nova_compute[192716]: 2025-10-07 21:50:54.053 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:50:55 compute-0 nova_compute[192716]: 2025-10-07 21:50:55.008 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:50:56 compute-0 nova_compute[192716]: 2025-10-07 21:50:56.253 2 DEBUG oslo_concurrency.lockutils [None req-4d6cceab-39e0-44a1-9f51-d27a56543935 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Acquiring lock "992a0107-b0cb-4907-8d2b-bd9d32808982" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:50:56 compute-0 nova_compute[192716]: 2025-10-07 21:50:56.254 2 DEBUG oslo_concurrency.lockutils [None req-4d6cceab-39e0-44a1-9f51-d27a56543935 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Lock "992a0107-b0cb-4907-8d2b-bd9d32808982" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:50:56 compute-0 nova_compute[192716]: 2025-10-07 21:50:56.254 2 DEBUG oslo_concurrency.lockutils [None req-4d6cceab-39e0-44a1-9f51-d27a56543935 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Acquiring lock "992a0107-b0cb-4907-8d2b-bd9d32808982-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:50:56 compute-0 nova_compute[192716]: 2025-10-07 21:50:56.254 2 DEBUG oslo_concurrency.lockutils [None req-4d6cceab-39e0-44a1-9f51-d27a56543935 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Lock "992a0107-b0cb-4907-8d2b-bd9d32808982-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:50:56 compute-0 nova_compute[192716]: 2025-10-07 21:50:56.254 2 DEBUG oslo_concurrency.lockutils [None req-4d6cceab-39e0-44a1-9f51-d27a56543935 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Lock "992a0107-b0cb-4907-8d2b-bd9d32808982-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:50:56 compute-0 nova_compute[192716]: 2025-10-07 21:50:56.266 2 INFO nova.compute.manager [None req-4d6cceab-39e0-44a1-9f51-d27a56543935 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Terminating instance
Oct 07 21:50:56 compute-0 nova_compute[192716]: 2025-10-07 21:50:56.781 2 DEBUG nova.compute.manager [None req-4d6cceab-39e0-44a1-9f51-d27a56543935 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 07 21:50:56 compute-0 kernel: tapdb056e33-17 (unregistering): left promiscuous mode
Oct 07 21:50:56 compute-0 NetworkManager[51722]: <info>  [1759873856.8110] device (tapdb056e33-17): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 07 21:50:56 compute-0 ovn_controller[94904]: 2025-10-07T21:50:56Z|00045|binding|INFO|Releasing lport db056e33-1774-4cfd-86d7-b31a8e48eeec from this chassis (sb_readonly=0)
Oct 07 21:50:56 compute-0 ovn_controller[94904]: 2025-10-07T21:50:56Z|00046|binding|INFO|Setting lport db056e33-1774-4cfd-86d7-b31a8e48eeec down in Southbound
Oct 07 21:50:56 compute-0 ovn_controller[94904]: 2025-10-07T21:50:56Z|00047|binding|INFO|Removing iface tapdb056e33-17 ovn-installed in OVS
Oct 07 21:50:56 compute-0 nova_compute[192716]: 2025-10-07 21:50:56.821 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:50:56 compute-0 nova_compute[192716]: 2025-10-07 21:50:56.824 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:50:56 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:56.831 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b9:d3:e7 10.100.0.3'], port_security=['fa:16:3e:b9:d3:e7 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '992a0107-b0cb-4907-8d2b-bd9d32808982', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-beb3e880-54b4-458c-8efd-ef631c65fa1e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd2a134798f4845d3adf6745353aa88f1', 'neutron:revision_number': '5', 'neutron:security_group_ids': '5dcc3416-b47f-443e-889e-b5db7d51b5c7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e95305b8-51d0-4753-9117-c1612bb00fbf, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], logical_port=db056e33-1774-4cfd-86d7-b31a8e48eeec) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f071072e510>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 21:50:56 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:56.832 103791 INFO neutron.agent.ovn.metadata.agent [-] Port db056e33-1774-4cfd-86d7-b31a8e48eeec in datapath beb3e880-54b4-458c-8efd-ef631c65fa1e unbound from our chassis
Oct 07 21:50:56 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:56.833 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network beb3e880-54b4-458c-8efd-ef631c65fa1e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 07 21:50:56 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:56.834 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[7bde0600-d0e5-45a9-857d-61a9420ed157]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:50:56 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:56.835 103791 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-beb3e880-54b4-458c-8efd-ef631c65fa1e namespace which is not needed anymore
Oct 07 21:50:56 compute-0 nova_compute[192716]: 2025-10-07 21:50:56.839 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:50:56 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000003.scope: Deactivated successfully.
Oct 07 21:50:56 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000003.scope: Consumed 8.805s CPU time.
Oct 07 21:50:56 compute-0 systemd-machined[152719]: Machine qemu-1-instance-00000003 terminated.
Oct 07 21:50:56 compute-0 neutron-haproxy-ovnmeta-beb3e880-54b4-458c-8efd-ef631c65fa1e[215817]: [NOTICE]   (215821) : haproxy version is 3.0.5-8e879a5
Oct 07 21:50:56 compute-0 neutron-haproxy-ovnmeta-beb3e880-54b4-458c-8efd-ef631c65fa1e[215817]: [NOTICE]   (215821) : path to executable is /usr/sbin/haproxy
Oct 07 21:50:56 compute-0 neutron-haproxy-ovnmeta-beb3e880-54b4-458c-8efd-ef631c65fa1e[215817]: [WARNING]  (215821) : Exiting Master process...
Oct 07 21:50:56 compute-0 podman[215879]: 2025-10-07 21:50:56.964218554 +0000 UTC m=+0.031861899 container kill f5dd49c71968bafe14c18bf9fd4b91c39f7e61f93696c7592dc1d39f411eccad (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-beb3e880-54b4-458c-8efd-ef631c65fa1e, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct 07 21:50:56 compute-0 neutron-haproxy-ovnmeta-beb3e880-54b4-458c-8efd-ef631c65fa1e[215817]: [ALERT]    (215821) : Current worker (215823) exited with code 143 (Terminated)
Oct 07 21:50:56 compute-0 neutron-haproxy-ovnmeta-beb3e880-54b4-458c-8efd-ef631c65fa1e[215817]: [WARNING]  (215821) : All workers exited. Exiting... (0)
Oct 07 21:50:56 compute-0 systemd[1]: libpod-f5dd49c71968bafe14c18bf9fd4b91c39f7e61f93696c7592dc1d39f411eccad.scope: Deactivated successfully.
Oct 07 21:50:56 compute-0 conmon[215817]: conmon f5dd49c71968bafe14c1 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-f5dd49c71968bafe14c18bf9fd4b91c39f7e61f93696c7592dc1d39f411eccad.scope/container/memory.events
Oct 07 21:50:57 compute-0 podman[215894]: 2025-10-07 21:50:57.008823926 +0000 UTC m=+0.024047941 container died f5dd49c71968bafe14c18bf9fd4b91c39f7e61f93696c7592dc1d39f411eccad (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-beb3e880-54b4-458c-8efd-ef631c65fa1e, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct 07 21:50:57 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f5dd49c71968bafe14c18bf9fd4b91c39f7e61f93696c7592dc1d39f411eccad-userdata-shm.mount: Deactivated successfully.
Oct 07 21:50:57 compute-0 systemd[1]: var-lib-containers-storage-overlay-2b9e4e3327e6edaa6c05e69d316ab2ed269bb311fb1c4c5de24b8278985faa84-merged.mount: Deactivated successfully.
Oct 07 21:50:57 compute-0 nova_compute[192716]: 2025-10-07 21:50:57.052 2 INFO nova.virt.libvirt.driver [-] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Instance destroyed successfully.
Oct 07 21:50:57 compute-0 nova_compute[192716]: 2025-10-07 21:50:57.053 2 DEBUG nova.objects.instance [None req-4d6cceab-39e0-44a1-9f51-d27a56543935 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Lazy-loading 'resources' on Instance uuid 992a0107-b0cb-4907-8d2b-bd9d32808982 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 21:50:57 compute-0 podman[215894]: 2025-10-07 21:50:57.057972434 +0000 UTC m=+0.073196439 container cleanup f5dd49c71968bafe14c18bf9fd4b91c39f7e61f93696c7592dc1d39f411eccad (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-beb3e880-54b4-458c-8efd-ef631c65fa1e, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Oct 07 21:50:57 compute-0 systemd[1]: libpod-conmon-f5dd49c71968bafe14c18bf9fd4b91c39f7e61f93696c7592dc1d39f411eccad.scope: Deactivated successfully.
Oct 07 21:50:57 compute-0 podman[215896]: 2025-10-07 21:50:57.076742267 +0000 UTC m=+0.082057936 container remove f5dd49c71968bafe14c18bf9fd4b91c39f7e61f93696c7592dc1d39f411eccad (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-beb3e880-54b4-458c-8efd-ef631c65fa1e, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 07 21:50:57 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:57.092 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[fd7bf759-4222-4d8b-8bde-d35b5acbfd29]: (4, ("Tue Oct  7 09:50:56 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-beb3e880-54b4-458c-8efd-ef631c65fa1e (f5dd49c71968bafe14c18bf9fd4b91c39f7e61f93696c7592dc1d39f411eccad)\nf5dd49c71968bafe14c18bf9fd4b91c39f7e61f93696c7592dc1d39f411eccad\nTue Oct  7 09:50:56 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-beb3e880-54b4-458c-8efd-ef631c65fa1e (f5dd49c71968bafe14c18bf9fd4b91c39f7e61f93696c7592dc1d39f411eccad)\nf5dd49c71968bafe14c18bf9fd4b91c39f7e61f93696c7592dc1d39f411eccad\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:50:57 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:57.094 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[b146e631-6995-4097-a92b-4898e0fbb819]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:50:57 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:57.095 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/beb3e880-54b4-458c-8efd-ef631c65fa1e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/beb3e880-54b4-458c-8efd-ef631c65fa1e.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 21:50:57 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:57.096 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[ebccd564-ae7c-4392-aa68-044fa2342219]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:50:57 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:57.096 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbeb3e880-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:50:57 compute-0 nova_compute[192716]: 2025-10-07 21:50:57.098 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:50:57 compute-0 kernel: tapbeb3e880-50: left promiscuous mode
Oct 07 21:50:57 compute-0 nova_compute[192716]: 2025-10-07 21:50:57.119 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:50:57 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:57.122 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[97a194a2-fa6a-404d-bdf2-5a030703348b]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:50:57 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:57.149 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[3a599eee-30d8-417a-ada7-918f3f010db7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:50:57 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:57.151 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[3a6bf5a7-fef3-42d4-aa17-786326e5c876]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:50:57 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:57.178 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[aa245372-02b3-43cf-8df4-3750af1d1701]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 358327, 'reachable_time': 39907, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215947, 'error': None, 'target': 'ovnmeta-beb3e880-54b4-458c-8efd-ef631c65fa1e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:50:57 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:57.185 103905 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-beb3e880-54b4-458c-8efd-ef631c65fa1e deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Oct 07 21:50:57 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:57.186 103905 DEBUG oslo.privsep.daemon [-] privsep: reply[0b99f67c-70b9-4a99-8c79-e666ed0f309c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:50:57 compute-0 systemd[1]: run-netns-ovnmeta\x2dbeb3e880\x2d54b4\x2d458c\x2d8efd\x2def631c65fa1e.mount: Deactivated successfully.
Oct 07 21:50:57 compute-0 nova_compute[192716]: 2025-10-07 21:50:57.560 2 DEBUG nova.virt.libvirt.vif [None req-4d6cceab-39e0-44a1-9f51-d27a56543935 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-10-07T21:50:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestDataModel-server-553344788',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testdatamodel-server-553344788',id=3,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-07T21:50:49Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d2a134798f4845d3adf6745353aa88f1',ramdisk_id='',reservation_id='r-kuj8y9y1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_
name='tempest-TestDataModel-950169491',owner_user_name='tempest-TestDataModel-950169491-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T21:50:49Z,user_data=None,user_id='a94ce99e27c1418a9a1adbaee490bca1',uuid=992a0107-b0cb-4907-8d2b-bd9d32808982,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "db056e33-1774-4cfd-86d7-b31a8e48eeec", "address": "fa:16:3e:b9:d3:e7", "network": {"id": "beb3e880-54b4-458c-8efd-ef631c65fa1e", "bridge": "br-int", "label": "tempest-TestDataModel-1675957073-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9eaa040469bd4cefb0eec956f814a225", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb056e33-17", "ovs_interfaceid": "db056e33-1774-4cfd-86d7-b31a8e48eeec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 07 21:50:57 compute-0 nova_compute[192716]: 2025-10-07 21:50:57.561 2 DEBUG nova.network.os_vif_util [None req-4d6cceab-39e0-44a1-9f51-d27a56543935 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Converting VIF {"id": "db056e33-1774-4cfd-86d7-b31a8e48eeec", "address": "fa:16:3e:b9:d3:e7", "network": {"id": "beb3e880-54b4-458c-8efd-ef631c65fa1e", "bridge": "br-int", "label": "tempest-TestDataModel-1675957073-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9eaa040469bd4cefb0eec956f814a225", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb056e33-17", "ovs_interfaceid": "db056e33-1774-4cfd-86d7-b31a8e48eeec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 21:50:57 compute-0 nova_compute[192716]: 2025-10-07 21:50:57.562 2 DEBUG nova.network.os_vif_util [None req-4d6cceab-39e0-44a1-9f51-d27a56543935 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b9:d3:e7,bridge_name='br-int',has_traffic_filtering=True,id=db056e33-1774-4cfd-86d7-b31a8e48eeec,network=Network(beb3e880-54b4-458c-8efd-ef631c65fa1e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb056e33-17') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 21:50:57 compute-0 nova_compute[192716]: 2025-10-07 21:50:57.562 2 DEBUG os_vif [None req-4d6cceab-39e0-44a1-9f51-d27a56543935 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b9:d3:e7,bridge_name='br-int',has_traffic_filtering=True,id=db056e33-1774-4cfd-86d7-b31a8e48eeec,network=Network(beb3e880-54b4-458c-8efd-ef631c65fa1e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb056e33-17') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 07 21:50:57 compute-0 nova_compute[192716]: 2025-10-07 21:50:57.565 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:50:57 compute-0 nova_compute[192716]: 2025-10-07 21:50:57.565 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdb056e33-17, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:50:57 compute-0 nova_compute[192716]: 2025-10-07 21:50:57.567 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:50:57 compute-0 nova_compute[192716]: 2025-10-07 21:50:57.569 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 07 21:50:57 compute-0 nova_compute[192716]: 2025-10-07 21:50:57.570 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:50:57 compute-0 nova_compute[192716]: 2025-10-07 21:50:57.570 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=e70d5918-a7e8-4f3b-b5bc-44d81f2e7862) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:50:57 compute-0 nova_compute[192716]: 2025-10-07 21:50:57.571 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:50:57 compute-0 nova_compute[192716]: 2025-10-07 21:50:57.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:50:57 compute-0 nova_compute[192716]: 2025-10-07 21:50:57.575 2 INFO os_vif [None req-4d6cceab-39e0-44a1-9f51-d27a56543935 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b9:d3:e7,bridge_name='br-int',has_traffic_filtering=True,id=db056e33-1774-4cfd-86d7-b31a8e48eeec,network=Network(beb3e880-54b4-458c-8efd-ef631c65fa1e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb056e33-17')
Oct 07 21:50:57 compute-0 nova_compute[192716]: 2025-10-07 21:50:57.576 2 INFO nova.virt.libvirt.driver [None req-4d6cceab-39e0-44a1-9f51-d27a56543935 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Deleting instance files /var/lib/nova/instances/992a0107-b0cb-4907-8d2b-bd9d32808982_del
Oct 07 21:50:57 compute-0 nova_compute[192716]: 2025-10-07 21:50:57.577 2 INFO nova.virt.libvirt.driver [None req-4d6cceab-39e0-44a1-9f51-d27a56543935 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Deletion of /var/lib/nova/instances/992a0107-b0cb-4907-8d2b-bd9d32808982_del complete
Oct 07 21:50:57 compute-0 nova_compute[192716]: 2025-10-07 21:50:57.750 2 DEBUG nova.compute.manager [req-e5f5a583-c3ff-43ca-a717-7a96d3a3c251 req-31895980-a204-43fd-ac9a-c4a62ed24168 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Received event network-vif-unplugged-db056e33-1774-4cfd-86d7-b31a8e48eeec external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 21:50:57 compute-0 nova_compute[192716]: 2025-10-07 21:50:57.750 2 DEBUG oslo_concurrency.lockutils [req-e5f5a583-c3ff-43ca-a717-7a96d3a3c251 req-31895980-a204-43fd-ac9a-c4a62ed24168 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "992a0107-b0cb-4907-8d2b-bd9d32808982-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:50:57 compute-0 nova_compute[192716]: 2025-10-07 21:50:57.751 2 DEBUG oslo_concurrency.lockutils [req-e5f5a583-c3ff-43ca-a717-7a96d3a3c251 req-31895980-a204-43fd-ac9a-c4a62ed24168 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "992a0107-b0cb-4907-8d2b-bd9d32808982-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:50:57 compute-0 nova_compute[192716]: 2025-10-07 21:50:57.751 2 DEBUG oslo_concurrency.lockutils [req-e5f5a583-c3ff-43ca-a717-7a96d3a3c251 req-31895980-a204-43fd-ac9a-c4a62ed24168 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "992a0107-b0cb-4907-8d2b-bd9d32808982-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:50:57 compute-0 nova_compute[192716]: 2025-10-07 21:50:57.752 2 DEBUG nova.compute.manager [req-e5f5a583-c3ff-43ca-a717-7a96d3a3c251 req-31895980-a204-43fd-ac9a-c4a62ed24168 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] No waiting events found dispatching network-vif-unplugged-db056e33-1774-4cfd-86d7-b31a8e48eeec pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 21:50:57 compute-0 nova_compute[192716]: 2025-10-07 21:50:57.752 2 DEBUG nova.compute.manager [req-e5f5a583-c3ff-43ca-a717-7a96d3a3c251 req-31895980-a204-43fd-ac9a-c4a62ed24168 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Received event network-vif-unplugged-db056e33-1774-4cfd-86d7-b31a8e48eeec for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 07 21:50:58 compute-0 nova_compute[192716]: 2025-10-07 21:50:58.088 2 INFO nova.compute.manager [None req-4d6cceab-39e0-44a1-9f51-d27a56543935 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Took 1.31 seconds to destroy the instance on the hypervisor.
Oct 07 21:50:58 compute-0 nova_compute[192716]: 2025-10-07 21:50:58.089 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-4d6cceab-39e0-44a1-9f51-d27a56543935 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 07 21:50:58 compute-0 nova_compute[192716]: 2025-10-07 21:50:58.089 2 DEBUG nova.compute.manager [-] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 07 21:50:58 compute-0 nova_compute[192716]: 2025-10-07 21:50:58.090 2 DEBUG nova.network.neutron [-] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 07 21:50:58 compute-0 nova_compute[192716]: 2025-10-07 21:50:58.090 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:50:58 compute-0 nova_compute[192716]: 2025-10-07 21:50:58.724 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:50:59 compute-0 nova_compute[192716]: 2025-10-07 21:50:59.056 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:50:59 compute-0 nova_compute[192716]: 2025-10-07 21:50:59.109 2 DEBUG nova.compute.manager [req-f7cf3cc6-95e4-4b4a-8c5c-0389a4738c69 req-20a38f15-940c-4f7f-9182-b8bbe1889673 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Received event network-vif-deleted-db056e33-1774-4cfd-86d7-b31a8e48eeec external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 21:50:59 compute-0 nova_compute[192716]: 2025-10-07 21:50:59.110 2 INFO nova.compute.manager [req-f7cf3cc6-95e4-4b4a-8c5c-0389a4738c69 req-20a38f15-940c-4f7f-9182-b8bbe1889673 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Neutron deleted interface db056e33-1774-4cfd-86d7-b31a8e48eeec; detaching it from the instance and deleting it from the info cache
Oct 07 21:50:59 compute-0 nova_compute[192716]: 2025-10-07 21:50:59.110 2 DEBUG nova.network.neutron [req-f7cf3cc6-95e4-4b4a-8c5c-0389a4738c69 req-20a38f15-940c-4f7f-9182-b8bbe1889673 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 21:50:59 compute-0 nova_compute[192716]: 2025-10-07 21:50:59.560 2 DEBUG nova.network.neutron [-] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 21:50:59 compute-0 nova_compute[192716]: 2025-10-07 21:50:59.618 2 DEBUG nova.compute.manager [req-f7cf3cc6-95e4-4b4a-8c5c-0389a4738c69 req-20a38f15-940c-4f7f-9182-b8bbe1889673 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Detach interface failed, port_id=db056e33-1774-4cfd-86d7-b31a8e48eeec, reason: Instance 992a0107-b0cb-4907-8d2b-bd9d32808982 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 07 21:50:59 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:50:59.644 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=dca786dc-b408-4181-8e47-0e14c60f13da, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:50:59 compute-0 podman[203153]: time="2025-10-07T21:50:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 21:50:59 compute-0 podman[203153]: @ - - [07/Oct/2025:21:50:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 21:50:59 compute-0 podman[203153]: @ - - [07/Oct/2025:21:50:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2985 "" "Go-http-client/1.1"
Oct 07 21:50:59 compute-0 nova_compute[192716]: 2025-10-07 21:50:59.804 2 DEBUG nova.compute.manager [req-5f9067bc-0803-4d4f-a55f-eaf8599e4134 req-b43838f6-fdf4-4fd6-a316-424ad5d974ff 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Received event network-vif-unplugged-db056e33-1774-4cfd-86d7-b31a8e48eeec external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 21:50:59 compute-0 nova_compute[192716]: 2025-10-07 21:50:59.805 2 DEBUG oslo_concurrency.lockutils [req-5f9067bc-0803-4d4f-a55f-eaf8599e4134 req-b43838f6-fdf4-4fd6-a316-424ad5d974ff 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "992a0107-b0cb-4907-8d2b-bd9d32808982-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:50:59 compute-0 nova_compute[192716]: 2025-10-07 21:50:59.806 2 DEBUG oslo_concurrency.lockutils [req-5f9067bc-0803-4d4f-a55f-eaf8599e4134 req-b43838f6-fdf4-4fd6-a316-424ad5d974ff 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "992a0107-b0cb-4907-8d2b-bd9d32808982-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:50:59 compute-0 nova_compute[192716]: 2025-10-07 21:50:59.806 2 DEBUG oslo_concurrency.lockutils [req-5f9067bc-0803-4d4f-a55f-eaf8599e4134 req-b43838f6-fdf4-4fd6-a316-424ad5d974ff 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "992a0107-b0cb-4907-8d2b-bd9d32808982-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:50:59 compute-0 nova_compute[192716]: 2025-10-07 21:50:59.806 2 DEBUG nova.compute.manager [req-5f9067bc-0803-4d4f-a55f-eaf8599e4134 req-b43838f6-fdf4-4fd6-a316-424ad5d974ff 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] No waiting events found dispatching network-vif-unplugged-db056e33-1774-4cfd-86d7-b31a8e48eeec pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 21:50:59 compute-0 nova_compute[192716]: 2025-10-07 21:50:59.806 2 DEBUG nova.compute.manager [req-5f9067bc-0803-4d4f-a55f-eaf8599e4134 req-b43838f6-fdf4-4fd6-a316-424ad5d974ff 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Received event network-vif-unplugged-db056e33-1774-4cfd-86d7-b31a8e48eeec for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 07 21:51:00 compute-0 nova_compute[192716]: 2025-10-07 21:51:00.070 2 INFO nova.compute.manager [-] [instance: 992a0107-b0cb-4907-8d2b-bd9d32808982] Took 1.98 seconds to deallocate network for instance.
Oct 07 21:51:00 compute-0 nova_compute[192716]: 2025-10-07 21:51:00.590 2 DEBUG oslo_concurrency.lockutils [None req-4d6cceab-39e0-44a1-9f51-d27a56543935 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:51:00 compute-0 nova_compute[192716]: 2025-10-07 21:51:00.591 2 DEBUG oslo_concurrency.lockutils [None req-4d6cceab-39e0-44a1-9f51-d27a56543935 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:51:00 compute-0 nova_compute[192716]: 2025-10-07 21:51:00.652 2 DEBUG nova.compute.provider_tree [None req-4d6cceab-39e0-44a1-9f51-d27a56543935 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Updating inventory in ProviderTree for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Oct 07 21:51:01 compute-0 nova_compute[192716]: 2025-10-07 21:51:01.188 2 ERROR nova.scheduler.client.report [None req-4d6cceab-39e0-44a1-9f51-d27a56543935 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] [req-9ceea5de-6e10-4dc5-9c2e-d59cc26c08ed] Failed to update inventory to [{'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID 19d1aa8e-e3fb-43ab-9849-122569e48a32.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-9ceea5de-6e10-4dc5-9c2e-d59cc26c08ed"}]}
Oct 07 21:51:01 compute-0 nova_compute[192716]: 2025-10-07 21:51:01.204 2 DEBUG nova.scheduler.client.report [None req-4d6cceab-39e0-44a1-9f51-d27a56543935 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Refreshing inventories for resource provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Oct 07 21:51:01 compute-0 nova_compute[192716]: 2025-10-07 21:51:01.216 2 DEBUG nova.scheduler.client.report [None req-4d6cceab-39e0-44a1-9f51-d27a56543935 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Updating ProviderTree inventory for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Oct 07 21:51:01 compute-0 nova_compute[192716]: 2025-10-07 21:51:01.216 2 DEBUG nova.compute.provider_tree [None req-4d6cceab-39e0-44a1-9f51-d27a56543935 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Updating inventory in ProviderTree for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Oct 07 21:51:01 compute-0 nova_compute[192716]: 2025-10-07 21:51:01.230 2 DEBUG nova.scheduler.client.report [None req-4d6cceab-39e0-44a1-9f51-d27a56543935 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Refreshing aggregate associations for resource provider 19d1aa8e-e3fb-43ab-9849-122569e48a32, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Oct 07 21:51:01 compute-0 nova_compute[192716]: 2025-10-07 21:51:01.245 2 DEBUG nova.scheduler.client.report [None req-4d6cceab-39e0-44a1-9f51-d27a56543935 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Refreshing trait associations for resource provider 19d1aa8e-e3fb-43ab-9849-122569e48a32, traits: COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_RESCUE_BFV,COMPUTE_ACCELERATORS,HW_CPU_X86_CLMUL,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NODE,HW_CPU_X86_AVX2,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSSE3,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_ARCH_X86_64,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_DEVICE_TAGGING,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_EXTEND,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_AVX,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE41,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SVM,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_BMI2,COMPUTE_SOUND_MODEL_SB16,COMPUTE_SOUND_MODEL_USB,COMPUTE_SOUND_MODEL_AC97,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_AESNI,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,HW_CPU_X86_F16C,HW_CPU_X86_MMX,COMPUTE_SECURITY_TPM_TIS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_CRB,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_ABM,
HW_ARCH_X86_64,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_STORAGE_BUS_SCSI _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Oct 07 21:51:01 compute-0 nova_compute[192716]: 2025-10-07 21:51:01.280 2 DEBUG nova.compute.provider_tree [None req-4d6cceab-39e0-44a1-9f51-d27a56543935 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Updating inventory in ProviderTree for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Oct 07 21:51:01 compute-0 openstack_network_exporter[205305]: ERROR   21:51:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:51:01 compute-0 openstack_network_exporter[205305]: ERROR   21:51:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:51:01 compute-0 openstack_network_exporter[205305]: ERROR   21:51:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 21:51:01 compute-0 openstack_network_exporter[205305]: ERROR   21:51:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 21:51:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 21:51:01 compute-0 openstack_network_exporter[205305]: ERROR   21:51:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 21:51:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 21:51:01 compute-0 anacron[4762]: Job `cron.daily' started
Oct 07 21:51:01 compute-0 anacron[4762]: Job `cron.daily' terminated
Oct 07 21:51:01 compute-0 nova_compute[192716]: 2025-10-07 21:51:01.826 2 DEBUG nova.scheduler.client.report [None req-4d6cceab-39e0-44a1-9f51-d27a56543935 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Updated inventory for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:975
Oct 07 21:51:01 compute-0 nova_compute[192716]: 2025-10-07 21:51:01.827 2 DEBUG nova.compute.provider_tree [None req-4d6cceab-39e0-44a1-9f51-d27a56543935 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Updating resource provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Oct 07 21:51:01 compute-0 nova_compute[192716]: 2025-10-07 21:51:01.827 2 DEBUG nova.compute.provider_tree [None req-4d6cceab-39e0-44a1-9f51-d27a56543935 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Updating inventory in ProviderTree for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Oct 07 21:51:02 compute-0 nova_compute[192716]: 2025-10-07 21:51:02.337 2 DEBUG oslo_concurrency.lockutils [None req-4d6cceab-39e0-44a1-9f51-d27a56543935 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.746s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:51:02 compute-0 nova_compute[192716]: 2025-10-07 21:51:02.396 2 INFO nova.scheduler.client.report [None req-4d6cceab-39e0-44a1-9f51-d27a56543935 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Deleted allocations for instance 992a0107-b0cb-4907-8d2b-bd9d32808982
Oct 07 21:51:02 compute-0 nova_compute[192716]: 2025-10-07 21:51:02.571 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:51:03 compute-0 nova_compute[192716]: 2025-10-07 21:51:03.432 2 DEBUG oslo_concurrency.lockutils [None req-4d6cceab-39e0-44a1-9f51-d27a56543935 a94ce99e27c1418a9a1adbaee490bca1 d2a134798f4845d3adf6745353aa88f1 - - default default] Lock "992a0107-b0cb-4907-8d2b-bd9d32808982" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.179s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:51:03 compute-0 podman[215954]: 2025-10-07 21:51:03.859458584 +0000 UTC m=+0.088065196 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.build-date=20251007, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true)
Oct 07 21:51:03 compute-0 podman[215953]: 2025-10-07 21:51:03.881024383 +0000 UTC m=+0.105344586 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4)
Oct 07 21:51:04 compute-0 nova_compute[192716]: 2025-10-07 21:51:04.055 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:51:07 compute-0 nova_compute[192716]: 2025-10-07 21:51:07.573 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:51:07 compute-0 podman[215993]: 2025-10-07 21:51:07.827312861 +0000 UTC m=+0.059729380 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 07 21:51:09 compute-0 nova_compute[192716]: 2025-10-07 21:51:09.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:51:10 compute-0 sshd-session[216017]: Invalid user admin from 116.110.151.5 port 39254
Oct 07 21:51:10 compute-0 sshd-session[216017]: pam_unix(sshd:auth): check pass; user unknown
Oct 07 21:51:10 compute-0 sshd-session[216017]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=116.110.151.5
Oct 07 21:51:11 compute-0 nova_compute[192716]: 2025-10-07 21:51:11.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:51:12 compute-0 nova_compute[192716]: 2025-10-07 21:51:12.575 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:51:12 compute-0 sshd-session[216017]: Failed password for invalid user admin from 116.110.151.5 port 39254 ssh2
Oct 07 21:51:14 compute-0 nova_compute[192716]: 2025-10-07 21:51:14.061 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:51:14 compute-0 sshd-session[216017]: Connection closed by invalid user admin 116.110.151.5 port 39254 [preauth]
Oct 07 21:51:14 compute-0 podman[216019]: 2025-10-07 21:51:14.874286525 +0000 UTC m=+0.110604073 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Oct 07 21:51:16 compute-0 nova_compute[192716]: 2025-10-07 21:51:16.498 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:51:16 compute-0 nova_compute[192716]: 2025-10-07 21:51:16.498 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 07 21:51:16 compute-0 nova_compute[192716]: 2025-10-07 21:51:16.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:51:16 compute-0 nova_compute[192716]: 2025-10-07 21:51:16.991 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Oct 07 21:51:17 compute-0 nova_compute[192716]: 2025-10-07 21:51:17.577 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:51:17 compute-0 nova_compute[192716]: 2025-10-07 21:51:17.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:51:17 compute-0 podman[216047]: 2025-10-07 21:51:17.882657279 +0000 UTC m=+0.111734274 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 07 21:51:18 compute-0 nova_compute[192716]: 2025-10-07 21:51:18.498 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:51:18 compute-0 nova_compute[192716]: 2025-10-07 21:51:18.499 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:51:18 compute-0 nova_compute[192716]: 2025-10-07 21:51:18.986 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:51:19 compute-0 nova_compute[192716]: 2025-10-07 21:51:19.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:51:19 compute-0 nova_compute[192716]: 2025-10-07 21:51:19.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:51:20 compute-0 sshd-session[216045]: Invalid user admin from 116.110.151.5 port 45806
Oct 07 21:51:20 compute-0 nova_compute[192716]: 2025-10-07 21:51:20.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:51:21 compute-0 sshd-session[216045]: pam_unix(sshd:auth): check pass; user unknown
Oct 07 21:51:21 compute-0 sshd-session[216045]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=116.110.151.5
Oct 07 21:51:21 compute-0 nova_compute[192716]: 2025-10-07 21:51:21.986 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:51:22 compute-0 nova_compute[192716]: 2025-10-07 21:51:22.496 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:51:22 compute-0 nova_compute[192716]: 2025-10-07 21:51:22.497 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Oct 07 21:51:22 compute-0 nova_compute[192716]: 2025-10-07 21:51:22.579 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:51:22 compute-0 podman[216066]: 2025-10-07 21:51:22.859902204 +0000 UTC m=+0.088338524 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, managed_by=edpm_ansible)
Oct 07 21:51:23 compute-0 nova_compute[192716]: 2025-10-07 21:51:23.002 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Oct 07 21:51:23 compute-0 sshd-session[216045]: Failed password for invalid user admin from 116.110.151.5 port 45806 ssh2
Oct 07 21:51:23 compute-0 nova_compute[192716]: 2025-10-07 21:51:23.496 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:51:23 compute-0 nova_compute[192716]: 2025-10-07 21:51:23.496 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:51:24 compute-0 nova_compute[192716]: 2025-10-07 21:51:24.007 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:51:24 compute-0 nova_compute[192716]: 2025-10-07 21:51:24.008 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:51:24 compute-0 nova_compute[192716]: 2025-10-07 21:51:24.008 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:51:24 compute-0 nova_compute[192716]: 2025-10-07 21:51:24.008 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 07 21:51:24 compute-0 nova_compute[192716]: 2025-10-07 21:51:24.064 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:51:24 compute-0 nova_compute[192716]: 2025-10-07 21:51:24.211 2 WARNING nova.virt.libvirt.driver [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 21:51:24 compute-0 nova_compute[192716]: 2025-10-07 21:51:24.212 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:51:24 compute-0 nova_compute[192716]: 2025-10-07 21:51:24.242 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.030s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:51:24 compute-0 nova_compute[192716]: 2025-10-07 21:51:24.243 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5856MB free_disk=73.30743026733398GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 07 21:51:24 compute-0 nova_compute[192716]: 2025-10-07 21:51:24.243 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:51:24 compute-0 nova_compute[192716]: 2025-10-07 21:51:24.243 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:51:24 compute-0 sshd-session[216045]: Connection closed by invalid user admin 116.110.151.5 port 45806 [preauth]
Oct 07 21:51:25 compute-0 nova_compute[192716]: 2025-10-07 21:51:25.296 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 07 21:51:25 compute-0 nova_compute[192716]: 2025-10-07 21:51:25.296 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:51:24 up  1:00,  0 user,  load average: 0.23, 0.22, 0.40\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 07 21:51:25 compute-0 nova_compute[192716]: 2025-10-07 21:51:25.326 2 DEBUG nova.compute.provider_tree [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 21:51:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:51:25.600 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:51:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:51:25.600 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:51:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:51:25.601 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:51:25 compute-0 nova_compute[192716]: 2025-10-07 21:51:25.832 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 21:51:26 compute-0 nova_compute[192716]: 2025-10-07 21:51:26.341 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 07 21:51:26 compute-0 nova_compute[192716]: 2025-10-07 21:51:26.342 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.098s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:51:27 compute-0 nova_compute[192716]: 2025-10-07 21:51:27.581 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:51:29 compute-0 nova_compute[192716]: 2025-10-07 21:51:29.067 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:51:29 compute-0 podman[203153]: time="2025-10-07T21:51:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 21:51:29 compute-0 podman[203153]: @ - - [07/Oct/2025:21:51:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 21:51:29 compute-0 podman[203153]: @ - - [07/Oct/2025:21:51:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2988 "" "Go-http-client/1.1"
Oct 07 21:51:31 compute-0 openstack_network_exporter[205305]: ERROR   21:51:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:51:31 compute-0 openstack_network_exporter[205305]: ERROR   21:51:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:51:31 compute-0 openstack_network_exporter[205305]: ERROR   21:51:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 21:51:31 compute-0 openstack_network_exporter[205305]: ERROR   21:51:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 21:51:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 21:51:31 compute-0 openstack_network_exporter[205305]: ERROR   21:51:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 21:51:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 21:51:31 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:51:31.831 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:36:94:9a 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-f0bd9c95-1d58-40c0-8d62-097453d85d3e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f0bd9c95-1d58-40c0-8d62-097453d85d3e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '97ef6e0949aa4dd8b3ac7e1495d532e1', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=badb36bd-51e1-4b06-9dec-6b9bc7164000, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=c0a40c81-05dd-4977-aaa2-2a56498aa3a2) old=Port_Binding(mac=['fa:16:3e:36:94:9a'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-f0bd9c95-1d58-40c0-8d62-097453d85d3e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f0bd9c95-1d58-40c0-8d62-097453d85d3e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '97ef6e0949aa4dd8b3ac7e1495d532e1', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 21:51:31 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:51:31.832 103791 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port c0a40c81-05dd-4977-aaa2-2a56498aa3a2 in datapath f0bd9c95-1d58-40c0-8d62-097453d85d3e updated
Oct 07 21:51:31 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:51:31.833 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f0bd9c95-1d58-40c0-8d62-097453d85d3e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 07 21:51:31 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:51:31.834 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[86df1afc-8c16-4ebe-96a4-1ac98cba1d75]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:51:32 compute-0 nova_compute[192716]: 2025-10-07 21:51:32.584 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:51:34 compute-0 nova_compute[192716]: 2025-10-07 21:51:34.068 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:51:34 compute-0 podman[216093]: 2025-10-07 21:51:34.819419706 +0000 UTC m=+0.058437484 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251007)
Oct 07 21:51:34 compute-0 podman[216092]: 2025-10-07 21:51:34.822704717 +0000 UTC m=+0.064516642 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, config_id=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 07 21:51:34 compute-0 sshd-session[216090]: Invalid user admin from 116.110.151.5 port 60330
Oct 07 21:51:35 compute-0 sshd-session[216090]: pam_unix(sshd:auth): check pass; user unknown
Oct 07 21:51:35 compute-0 sshd-session[216090]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=116.110.151.5
Oct 07 21:51:37 compute-0 nova_compute[192716]: 2025-10-07 21:51:37.586 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:51:37 compute-0 sshd-session[216090]: Failed password for invalid user admin from 116.110.151.5 port 60330 ssh2
Oct 07 21:51:38 compute-0 podman[216131]: 2025-10-07 21:51:38.804680806 +0000 UTC m=+0.049023523 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 07 21:51:39 compute-0 nova_compute[192716]: 2025-10-07 21:51:39.070 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:51:40 compute-0 sshd-session[216090]: Connection closed by invalid user admin 116.110.151.5 port 60330 [preauth]
Oct 07 21:51:41 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:51:41.847 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:46:bb:93 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-ebae3973-2fc1-453e-9c70-67dfb3af02f2', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ebae3973-2fc1-453e-9c70-67dfb3af02f2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '42e6cb8a77b54158b2345b916b6fd79b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b94da4e2-d506-461d-b697-0c574644551f, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=498be2fd-f540-40e4-b197-d1e78be9de09) old=Port_Binding(mac=['fa:16:3e:46:bb:93'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-ebae3973-2fc1-453e-9c70-67dfb3af02f2', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ebae3973-2fc1-453e-9c70-67dfb3af02f2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '42e6cb8a77b54158b2345b916b6fd79b', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 21:51:41 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:51:41.848 103791 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 498be2fd-f540-40e4-b197-d1e78be9de09 in datapath ebae3973-2fc1-453e-9c70-67dfb3af02f2 updated
Oct 07 21:51:41 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:51:41.849 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ebae3973-2fc1-453e-9c70-67dfb3af02f2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 07 21:51:41 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:51:41.849 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[b1c968a6-11c3-40a0-bf88-54f00241d6b5]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:51:42 compute-0 nova_compute[192716]: 2025-10-07 21:51:42.588 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:51:44 compute-0 nova_compute[192716]: 2025-10-07 21:51:44.107 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:51:45 compute-0 podman[216156]: 2025-10-07 21:51:45.897993898 +0000 UTC m=+0.117510555 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251007)
Oct 07 21:51:47 compute-0 nova_compute[192716]: 2025-10-07 21:51:47.591 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:51:48 compute-0 podman[216182]: 2025-10-07 21:51:48.805692656 +0000 UTC m=+0.052508440 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Oct 07 21:51:49 compute-0 nova_compute[192716]: 2025-10-07 21:51:49.109 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:51:52 compute-0 nova_compute[192716]: 2025-10-07 21:51:52.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:51:52 compute-0 ovn_controller[94904]: 2025-10-07T21:51:52Z|00048|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Oct 07 21:51:53 compute-0 podman[216201]: 2025-10-07 21:51:53.821020398 +0000 UTC m=+0.064655626 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, config_id=edpm, container_name=openstack_network_exporter, io.buildah.version=1.33.7, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, name=ubi9-minimal, 
url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, release=1755695350, vcs-type=git, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., maintainer=Red Hat, Inc.)
Oct 07 21:51:54 compute-0 nova_compute[192716]: 2025-10-07 21:51:54.110 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:51:57 compute-0 nova_compute[192716]: 2025-10-07 21:51:57.595 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:51:59 compute-0 nova_compute[192716]: 2025-10-07 21:51:59.112 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:51:59 compute-0 podman[203153]: time="2025-10-07T21:51:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 21:51:59 compute-0 podman[203153]: @ - - [07/Oct/2025:21:51:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 21:51:59 compute-0 podman[203153]: @ - - [07/Oct/2025:21:51:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2988 "" "Go-http-client/1.1"
Oct 07 21:52:01 compute-0 openstack_network_exporter[205305]: ERROR   21:52:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 21:52:01 compute-0 openstack_network_exporter[205305]: ERROR   21:52:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:52:01 compute-0 openstack_network_exporter[205305]: ERROR   21:52:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:52:01 compute-0 openstack_network_exporter[205305]: ERROR   21:52:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 21:52:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 21:52:01 compute-0 openstack_network_exporter[205305]: ERROR   21:52:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 21:52:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 21:52:02 compute-0 nova_compute[192716]: 2025-10-07 21:52:02.598 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:52:04 compute-0 nova_compute[192716]: 2025-10-07 21:52:04.115 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:52:05 compute-0 podman[216223]: 2025-10-07 21:52:05.842521121 +0000 UTC m=+0.078858712 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, container_name=iscsid, managed_by=edpm_ansible)
Oct 07 21:52:05 compute-0 podman[216224]: 2025-10-07 21:52:05.860923112 +0000 UTC m=+0.086624357 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=multipathd, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Oct 07 21:52:07 compute-0 nova_compute[192716]: 2025-10-07 21:52:07.635 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:52:08 compute-0 nova_compute[192716]: 2025-10-07 21:52:08.575 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:52:09 compute-0 nova_compute[192716]: 2025-10-07 21:52:09.117 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:52:09 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:52:09.735 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:e1:cc', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '62:84:ba:a4:1b:31'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 21:52:09 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:52:09.736 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 07 21:52:09 compute-0 nova_compute[192716]: 2025-10-07 21:52:09.736 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:52:09 compute-0 podman[216262]: 2025-10-07 21:52:09.877890682 +0000 UTC m=+0.101563912 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 07 21:52:12 compute-0 nova_compute[192716]: 2025-10-07 21:52:12.637 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:52:14 compute-0 nova_compute[192716]: 2025-10-07 21:52:14.118 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:52:16 compute-0 podman[216287]: 2025-10-07 21:52:16.904567253 +0000 UTC m=+0.137266743 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4)
Oct 07 21:52:17 compute-0 nova_compute[192716]: 2025-10-07 21:52:17.639 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:52:17 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:52:17.738 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=dca786dc-b408-4181-8e47-0e14c60f13da, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:52:17 compute-0 nova_compute[192716]: 2025-10-07 21:52:17.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:52:17 compute-0 nova_compute[192716]: 2025-10-07 21:52:17.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:52:17 compute-0 nova_compute[192716]: 2025-10-07 21:52:17.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:52:17 compute-0 nova_compute[192716]: 2025-10-07 21:52:17.991 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 07 21:52:19 compute-0 nova_compute[192716]: 2025-10-07 21:52:19.120 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:52:19 compute-0 podman[216313]: 2025-10-07 21:52:19.845424913 +0000 UTC m=+0.070934261 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, 
tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Oct 07 21:52:20 compute-0 nova_compute[192716]: 2025-10-07 21:52:20.986 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:52:21 compute-0 nova_compute[192716]: 2025-10-07 21:52:21.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:52:21 compute-0 nova_compute[192716]: 2025-10-07 21:52:21.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:52:22 compute-0 nova_compute[192716]: 2025-10-07 21:52:22.642 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:52:23 compute-0 nova_compute[192716]: 2025-10-07 21:52:23.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:52:24 compute-0 nova_compute[192716]: 2025-10-07 21:52:24.124 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:52:24 compute-0 podman[216332]: 2025-10-07 21:52:24.867315896 +0000 UTC m=+0.108083033 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, release=1755695350, architecture=x86_64, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, vcs-type=git, version=9.6, container_name=openstack_network_exporter, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers)
Oct 07 21:52:24 compute-0 nova_compute[192716]: 2025-10-07 21:52:24.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:52:25 compute-0 nova_compute[192716]: 2025-10-07 21:52:25.502 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:52:25 compute-0 nova_compute[192716]: 2025-10-07 21:52:25.502 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:52:25 compute-0 nova_compute[192716]: 2025-10-07 21:52:25.502 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:52:25 compute-0 nova_compute[192716]: 2025-10-07 21:52:25.502 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 07 21:52:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:52:25.601 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:52:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:52:25.602 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:52:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:52:25.602 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:52:25 compute-0 nova_compute[192716]: 2025-10-07 21:52:25.688 2 WARNING nova.virt.libvirt.driver [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 21:52:25 compute-0 nova_compute[192716]: 2025-10-07 21:52:25.690 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:52:25 compute-0 nova_compute[192716]: 2025-10-07 21:52:25.714 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.025s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:52:25 compute-0 nova_compute[192716]: 2025-10-07 21:52:25.715 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5902MB free_disk=73.3073959350586GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 07 21:52:25 compute-0 nova_compute[192716]: 2025-10-07 21:52:25.715 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:52:25 compute-0 nova_compute[192716]: 2025-10-07 21:52:25.716 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:52:26 compute-0 nova_compute[192716]: 2025-10-07 21:52:26.800 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 07 21:52:26 compute-0 nova_compute[192716]: 2025-10-07 21:52:26.800 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:52:25 up  1:01,  0 user,  load average: 0.08, 0.18, 0.37\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 07 21:52:26 compute-0 nova_compute[192716]: 2025-10-07 21:52:26.876 2 DEBUG nova.compute.provider_tree [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 21:52:27 compute-0 nova_compute[192716]: 2025-10-07 21:52:27.387 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 21:52:27 compute-0 nova_compute[192716]: 2025-10-07 21:52:27.644 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:52:27 compute-0 nova_compute[192716]: 2025-10-07 21:52:27.913 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 07 21:52:27 compute-0 nova_compute[192716]: 2025-10-07 21:52:27.914 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.199s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:52:29 compute-0 nova_compute[192716]: 2025-10-07 21:52:29.126 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:52:29 compute-0 podman[203153]: time="2025-10-07T21:52:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 21:52:29 compute-0 podman[203153]: @ - - [07/Oct/2025:21:52:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 21:52:29 compute-0 podman[203153]: @ - - [07/Oct/2025:21:52:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2993 "" "Go-http-client/1.1"
Oct 07 21:52:30 compute-0 nova_compute[192716]: 2025-10-07 21:52:30.227 2 DEBUG oslo_concurrency.lockutils [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Acquiring lock "5138bd92-9a6e-4088-b0b2-bee3a14683ac" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:52:30 compute-0 nova_compute[192716]: 2025-10-07 21:52:30.227 2 DEBUG oslo_concurrency.lockutils [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "5138bd92-9a6e-4088-b0b2-bee3a14683ac" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:52:30 compute-0 nova_compute[192716]: 2025-10-07 21:52:30.733 2 DEBUG nova.compute.manager [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Oct 07 21:52:31 compute-0 nova_compute[192716]: 2025-10-07 21:52:31.303 2 DEBUG oslo_concurrency.lockutils [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:52:31 compute-0 nova_compute[192716]: 2025-10-07 21:52:31.304 2 DEBUG oslo_concurrency.lockutils [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:52:31 compute-0 nova_compute[192716]: 2025-10-07 21:52:31.315 2 DEBUG nova.virt.hardware [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Oct 07 21:52:31 compute-0 nova_compute[192716]: 2025-10-07 21:52:31.315 2 INFO nova.compute.claims [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Claim successful on node compute-0.ctlplane.example.com
Oct 07 21:52:31 compute-0 openstack_network_exporter[205305]: ERROR   21:52:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 21:52:31 compute-0 openstack_network_exporter[205305]: ERROR   21:52:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:52:31 compute-0 openstack_network_exporter[205305]: ERROR   21:52:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:52:31 compute-0 openstack_network_exporter[205305]: ERROR   21:52:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 21:52:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 21:52:31 compute-0 openstack_network_exporter[205305]: ERROR   21:52:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 21:52:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 21:52:32 compute-0 nova_compute[192716]: 2025-10-07 21:52:32.380 2 DEBUG nova.compute.provider_tree [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 21:52:32 compute-0 nova_compute[192716]: 2025-10-07 21:52:32.646 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:52:32 compute-0 nova_compute[192716]: 2025-10-07 21:52:32.888 2 DEBUG nova.scheduler.client.report [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 21:52:33 compute-0 nova_compute[192716]: 2025-10-07 21:52:33.399 2 DEBUG oslo_concurrency.lockutils [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.095s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:52:33 compute-0 nova_compute[192716]: 2025-10-07 21:52:33.401 2 DEBUG nova.compute.manager [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Oct 07 21:52:33 compute-0 nova_compute[192716]: 2025-10-07 21:52:33.914 2 DEBUG nova.compute.manager [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Oct 07 21:52:33 compute-0 nova_compute[192716]: 2025-10-07 21:52:33.915 2 DEBUG nova.network.neutron [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Oct 07 21:52:33 compute-0 nova_compute[192716]: 2025-10-07 21:52:33.915 2 WARNING neutronclient.v2_0.client [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:52:33 compute-0 nova_compute[192716]: 2025-10-07 21:52:33.916 2 WARNING neutronclient.v2_0.client [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:52:34 compute-0 nova_compute[192716]: 2025-10-07 21:52:34.127 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:52:34 compute-0 nova_compute[192716]: 2025-10-07 21:52:34.422 2 INFO nova.virt.libvirt.driver [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 07 21:52:34 compute-0 nova_compute[192716]: 2025-10-07 21:52:34.932 2 DEBUG nova.compute.manager [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Oct 07 21:52:35 compute-0 nova_compute[192716]: 2025-10-07 21:52:35.393 2 DEBUG nova.network.neutron [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Successfully created port: 0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Oct 07 21:52:35 compute-0 nova_compute[192716]: 2025-10-07 21:52:35.953 2 DEBUG nova.compute.manager [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Oct 07 21:52:35 compute-0 nova_compute[192716]: 2025-10-07 21:52:35.955 2 DEBUG nova.virt.libvirt.driver [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Oct 07 21:52:35 compute-0 nova_compute[192716]: 2025-10-07 21:52:35.956 2 INFO nova.virt.libvirt.driver [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Creating image(s)
Oct 07 21:52:35 compute-0 nova_compute[192716]: 2025-10-07 21:52:35.957 2 DEBUG oslo_concurrency.lockutils [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Acquiring lock "/var/lib/nova/instances/5138bd92-9a6e-4088-b0b2-bee3a14683ac/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:52:35 compute-0 nova_compute[192716]: 2025-10-07 21:52:35.957 2 DEBUG oslo_concurrency.lockutils [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "/var/lib/nova/instances/5138bd92-9a6e-4088-b0b2-bee3a14683ac/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:52:35 compute-0 nova_compute[192716]: 2025-10-07 21:52:35.958 2 DEBUG oslo_concurrency.lockutils [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "/var/lib/nova/instances/5138bd92-9a6e-4088-b0b2-bee3a14683ac/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:52:35 compute-0 nova_compute[192716]: 2025-10-07 21:52:35.960 2 DEBUG oslo_utils.imageutils.format_inspector [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 21:52:35 compute-0 nova_compute[192716]: 2025-10-07 21:52:35.967 2 DEBUG oslo_utils.imageutils.format_inspector [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 21:52:35 compute-0 nova_compute[192716]: 2025-10-07 21:52:35.969 2 DEBUG oslo_concurrency.processutils [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:52:36 compute-0 nova_compute[192716]: 2025-10-07 21:52:36.045 2 DEBUG oslo_concurrency.processutils [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:52:36 compute-0 nova_compute[192716]: 2025-10-07 21:52:36.048 2 DEBUG oslo_concurrency.lockutils [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Acquiring lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:52:36 compute-0 nova_compute[192716]: 2025-10-07 21:52:36.050 2 DEBUG oslo_concurrency.lockutils [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:52:36 compute-0 nova_compute[192716]: 2025-10-07 21:52:36.051 2 DEBUG oslo_utils.imageutils.format_inspector [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 21:52:36 compute-0 nova_compute[192716]: 2025-10-07 21:52:36.058 2 DEBUG oslo_utils.imageutils.format_inspector [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 21:52:36 compute-0 nova_compute[192716]: 2025-10-07 21:52:36.059 2 DEBUG oslo_concurrency.processutils [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:52:36 compute-0 nova_compute[192716]: 2025-10-07 21:52:36.127 2 DEBUG oslo_concurrency.processutils [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:52:36 compute-0 nova_compute[192716]: 2025-10-07 21:52:36.129 2 DEBUG oslo_concurrency.processutils [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71,backing_fmt=raw /var/lib/nova/instances/5138bd92-9a6e-4088-b0b2-bee3a14683ac/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:52:36 compute-0 nova_compute[192716]: 2025-10-07 21:52:36.186 2 DEBUG oslo_concurrency.processutils [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71,backing_fmt=raw /var/lib/nova/instances/5138bd92-9a6e-4088-b0b2-bee3a14683ac/disk 1073741824" returned: 0 in 0.057s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:52:36 compute-0 nova_compute[192716]: 2025-10-07 21:52:36.187 2 DEBUG oslo_concurrency.lockutils [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.137s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:52:36 compute-0 nova_compute[192716]: 2025-10-07 21:52:36.188 2 DEBUG oslo_concurrency.processutils [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:52:36 compute-0 nova_compute[192716]: 2025-10-07 21:52:36.240 2 DEBUG oslo_concurrency.processutils [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:52:36 compute-0 nova_compute[192716]: 2025-10-07 21:52:36.242 2 DEBUG nova.virt.disk.api [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Checking if we can resize image /var/lib/nova/instances/5138bd92-9a6e-4088-b0b2-bee3a14683ac/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 07 21:52:36 compute-0 nova_compute[192716]: 2025-10-07 21:52:36.242 2 DEBUG oslo_concurrency.processutils [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5138bd92-9a6e-4088-b0b2-bee3a14683ac/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:52:36 compute-0 nova_compute[192716]: 2025-10-07 21:52:36.297 2 DEBUG oslo_concurrency.processutils [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5138bd92-9a6e-4088-b0b2-bee3a14683ac/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:52:36 compute-0 nova_compute[192716]: 2025-10-07 21:52:36.299 2 DEBUG nova.virt.disk.api [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Cannot resize image /var/lib/nova/instances/5138bd92-9a6e-4088-b0b2-bee3a14683ac/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 07 21:52:36 compute-0 nova_compute[192716]: 2025-10-07 21:52:36.300 2 DEBUG nova.virt.libvirt.driver [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Oct 07 21:52:36 compute-0 nova_compute[192716]: 2025-10-07 21:52:36.301 2 DEBUG nova.virt.libvirt.driver [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Ensure instance console log exists: /var/lib/nova/instances/5138bd92-9a6e-4088-b0b2-bee3a14683ac/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Oct 07 21:52:36 compute-0 nova_compute[192716]: 2025-10-07 21:52:36.302 2 DEBUG oslo_concurrency.lockutils [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:52:36 compute-0 nova_compute[192716]: 2025-10-07 21:52:36.302 2 DEBUG oslo_concurrency.lockutils [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:52:36 compute-0 nova_compute[192716]: 2025-10-07 21:52:36.303 2 DEBUG oslo_concurrency.lockutils [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:52:36 compute-0 nova_compute[192716]: 2025-10-07 21:52:36.750 2 DEBUG nova.network.neutron [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Successfully updated port: 0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Oct 07 21:52:36 compute-0 nova_compute[192716]: 2025-10-07 21:52:36.840 2 DEBUG nova.compute.manager [req-ddb7a35e-e0d9-4baa-8c2c-73e672a879b5 req-0113d778-7b63-49e2-a6d5-859d5d380608 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Received event network-changed-0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 21:52:36 compute-0 nova_compute[192716]: 2025-10-07 21:52:36.841 2 DEBUG nova.compute.manager [req-ddb7a35e-e0d9-4baa-8c2c-73e672a879b5 req-0113d778-7b63-49e2-a6d5-859d5d380608 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Refreshing instance network info cache due to event network-changed-0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Oct 07 21:52:36 compute-0 nova_compute[192716]: 2025-10-07 21:52:36.841 2 DEBUG oslo_concurrency.lockutils [req-ddb7a35e-e0d9-4baa-8c2c-73e672a879b5 req-0113d778-7b63-49e2-a6d5-859d5d380608 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "refresh_cache-5138bd92-9a6e-4088-b0b2-bee3a14683ac" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 07 21:52:36 compute-0 nova_compute[192716]: 2025-10-07 21:52:36.841 2 DEBUG oslo_concurrency.lockutils [req-ddb7a35e-e0d9-4baa-8c2c-73e672a879b5 req-0113d778-7b63-49e2-a6d5-859d5d380608 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquired lock "refresh_cache-5138bd92-9a6e-4088-b0b2-bee3a14683ac" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 07 21:52:36 compute-0 nova_compute[192716]: 2025-10-07 21:52:36.842 2 DEBUG nova.network.neutron [req-ddb7a35e-e0d9-4baa-8c2c-73e672a879b5 req-0113d778-7b63-49e2-a6d5-859d5d380608 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Refreshing network info cache for port 0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Oct 07 21:52:36 compute-0 podman[216370]: 2025-10-07 21:52:36.861592374 +0000 UTC m=+0.098164036 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 07 21:52:36 compute-0 podman[216371]: 2025-10-07 21:52:36.862495199 +0000 UTC m=+0.092827168 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible)
Oct 07 21:52:37 compute-0 nova_compute[192716]: 2025-10-07 21:52:37.258 2 DEBUG oslo_concurrency.lockutils [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Acquiring lock "refresh_cache-5138bd92-9a6e-4088-b0b2-bee3a14683ac" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 07 21:52:37 compute-0 nova_compute[192716]: 2025-10-07 21:52:37.351 2 WARNING neutronclient.v2_0.client [req-ddb7a35e-e0d9-4baa-8c2c-73e672a879b5 req-0113d778-7b63-49e2-a6d5-859d5d380608 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:52:37 compute-0 nova_compute[192716]: 2025-10-07 21:52:37.648 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:52:37 compute-0 nova_compute[192716]: 2025-10-07 21:52:37.709 2 DEBUG nova.network.neutron [req-ddb7a35e-e0d9-4baa-8c2c-73e672a879b5 req-0113d778-7b63-49e2-a6d5-859d5d380608 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 07 21:52:37 compute-0 nova_compute[192716]: 2025-10-07 21:52:37.926 2 DEBUG nova.network.neutron [req-ddb7a35e-e0d9-4baa-8c2c-73e672a879b5 req-0113d778-7b63-49e2-a6d5-859d5d380608 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 21:52:38 compute-0 nova_compute[192716]: 2025-10-07 21:52:38.432 2 DEBUG oslo_concurrency.lockutils [req-ddb7a35e-e0d9-4baa-8c2c-73e672a879b5 req-0113d778-7b63-49e2-a6d5-859d5d380608 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Releasing lock "refresh_cache-5138bd92-9a6e-4088-b0b2-bee3a14683ac" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 07 21:52:38 compute-0 nova_compute[192716]: 2025-10-07 21:52:38.433 2 DEBUG oslo_concurrency.lockutils [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Acquired lock "refresh_cache-5138bd92-9a6e-4088-b0b2-bee3a14683ac" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 07 21:52:38 compute-0 nova_compute[192716]: 2025-10-07 21:52:38.433 2 DEBUG nova.network.neutron [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 07 21:52:39 compute-0 nova_compute[192716]: 2025-10-07 21:52:39.130 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:52:39 compute-0 nova_compute[192716]: 2025-10-07 21:52:39.749 2 DEBUG nova.network.neutron [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 07 21:52:40 compute-0 nova_compute[192716]: 2025-10-07 21:52:40.727 2 WARNING neutronclient.v2_0.client [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:52:40 compute-0 podman[216411]: 2025-10-07 21:52:40.833704109 +0000 UTC m=+0.063729961 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 07 21:52:41 compute-0 nova_compute[192716]: 2025-10-07 21:52:41.179 2 DEBUG nova.network.neutron [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Updating instance_info_cache with network_info: [{"id": "0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce", "address": "fa:16:3e:fb:65:92", "network": {"id": "f0bd9c95-1d58-40c0-8d62-097453d85d3e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-309070824-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97ef6e0949aa4dd8b3ac7e1495d532e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0f47f8fd-d8", "ovs_interfaceid": "0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 21:52:41 compute-0 nova_compute[192716]: 2025-10-07 21:52:41.685 2 DEBUG oslo_concurrency.lockutils [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Releasing lock "refresh_cache-5138bd92-9a6e-4088-b0b2-bee3a14683ac" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 07 21:52:41 compute-0 nova_compute[192716]: 2025-10-07 21:52:41.685 2 DEBUG nova.compute.manager [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Instance network_info: |[{"id": "0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce", "address": "fa:16:3e:fb:65:92", "network": {"id": "f0bd9c95-1d58-40c0-8d62-097453d85d3e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-309070824-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97ef6e0949aa4dd8b3ac7e1495d532e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0f47f8fd-d8", "ovs_interfaceid": "0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Oct 07 21:52:41 compute-0 nova_compute[192716]: 2025-10-07 21:52:41.689 2 DEBUG nova.virt.libvirt.driver [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Start _get_guest_xml network_info=[{"id": "0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce", "address": "fa:16:3e:fb:65:92", "network": {"id": "f0bd9c95-1d58-40c0-8d62-097453d85d3e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-309070824-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97ef6e0949aa4dd8b3ac7e1495d532e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0f47f8fd-d8", "ovs_interfaceid": "0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T21:45:40Z,direct_url=<?>,disk_format='qcow2',id=c40cab67-7e52-4762-b275-de0efa24bdf4,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='293ff4341f3d48a4ae100bf4fc7b99bd',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T21:45:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'boot_index': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'image_id': 'c40cab67-7e52-4762-b275-de0efa24bdf4'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Oct 07 21:52:41 compute-0 nova_compute[192716]: 2025-10-07 21:52:41.694 2 WARNING nova.virt.libvirt.driver [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 21:52:41 compute-0 nova_compute[192716]: 2025-10-07 21:52:41.696 2 DEBUG nova.virt.driver [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='c40cab67-7e52-4762-b275-de0efa24bdf4', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteActionsViaActuator-server-1913099881', uuid='5138bd92-9a6e-4088-b0b2-bee3a14683ac'), owner=OwnerMeta(userid='b71b837a81994b9694ede764e0406ac8', username='tempest-TestExecuteActionsViaActuator-1409880739-project-admin', projectid='42e6cb8a77b54158b2345b916b6fd79b', projectname='tempest-TestExecuteActionsViaActuator-1409880739'), image=ImageMeta(id='c40cab67-7e52-4762-b275-de0efa24bdf4', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='e6ccb6f6-2ec4-4305-9fdd-4d931e83ed21', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce", "address": "fa:16:3e:fb:65:92", "network": {"id": "f0bd9c95-1d58-40c0-8d62-097453d85d3e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-309070824-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97ef6e0949aa4dd8b3ac7e1495d532e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0f47f8fd-d8", "ovs_interfaceid": 
"0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251007122402.7278e66.el10', creation_time=1759873961.696491) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Oct 07 21:52:41 compute-0 nova_compute[192716]: 2025-10-07 21:52:41.701 2 DEBUG nova.virt.libvirt.host [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Oct 07 21:52:41 compute-0 nova_compute[192716]: 2025-10-07 21:52:41.701 2 DEBUG nova.virt.libvirt.host [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Oct 07 21:52:41 compute-0 nova_compute[192716]: 2025-10-07 21:52:41.704 2 DEBUG nova.virt.libvirt.host [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Oct 07 21:52:41 compute-0 nova_compute[192716]: 2025-10-07 21:52:41.704 2 DEBUG nova.virt.libvirt.host [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Oct 07 21:52:41 compute-0 nova_compute[192716]: 2025-10-07 21:52:41.705 2 DEBUG nova.virt.libvirt.driver [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Oct 07 21:52:41 compute-0 nova_compute[192716]: 2025-10-07 21:52:41.705 2 DEBUG nova.virt.hardware [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-07T21:45:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='e6ccb6f6-2ec4-4305-9fdd-4d931e83ed21',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T21:45:40Z,direct_url=<?>,disk_format='qcow2',id=c40cab67-7e52-4762-b275-de0efa24bdf4,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='293ff4341f3d48a4ae100bf4fc7b99bd',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T21:45:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Oct 07 21:52:41 compute-0 nova_compute[192716]: 2025-10-07 21:52:41.705 2 DEBUG nova.virt.hardware [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Oct 07 21:52:41 compute-0 nova_compute[192716]: 2025-10-07 21:52:41.706 2 DEBUG nova.virt.hardware [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Oct 07 21:52:41 compute-0 nova_compute[192716]: 2025-10-07 21:52:41.706 2 DEBUG nova.virt.hardware [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Oct 07 21:52:41 compute-0 nova_compute[192716]: 2025-10-07 21:52:41.706 2 DEBUG nova.virt.hardware [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Oct 07 21:52:41 compute-0 nova_compute[192716]: 2025-10-07 21:52:41.706 2 DEBUG nova.virt.hardware [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Oct 07 21:52:41 compute-0 nova_compute[192716]: 2025-10-07 21:52:41.706 2 DEBUG nova.virt.hardware [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Oct 07 21:52:41 compute-0 nova_compute[192716]: 2025-10-07 21:52:41.707 2 DEBUG nova.virt.hardware [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Oct 07 21:52:41 compute-0 nova_compute[192716]: 2025-10-07 21:52:41.707 2 DEBUG nova.virt.hardware [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Oct 07 21:52:41 compute-0 nova_compute[192716]: 2025-10-07 21:52:41.707 2 DEBUG nova.virt.hardware [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Oct 07 21:52:41 compute-0 nova_compute[192716]: 2025-10-07 21:52:41.707 2 DEBUG nova.virt.hardware [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Oct 07 21:52:41 compute-0 nova_compute[192716]: 2025-10-07 21:52:41.710 2 DEBUG nova.virt.libvirt.vif [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-07T21:52:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1913099881',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1913099881',id=5,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='42e6cb8a77b54158b2345b916b6fd79b',ramdisk_id='',reservation_id='r-0x9q5j10',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1409880739',owner_user_name='tempest-TestExecuteActionsViaA
ctuator-1409880739-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T21:52:34Z,user_data=None,user_id='b71b837a81994b9694ede764e0406ac8',uuid=5138bd92-9a6e-4088-b0b2-bee3a14683ac,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce", "address": "fa:16:3e:fb:65:92", "network": {"id": "f0bd9c95-1d58-40c0-8d62-097453d85d3e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-309070824-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97ef6e0949aa4dd8b3ac7e1495d532e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0f47f8fd-d8", "ovs_interfaceid": "0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Oct 07 21:52:41 compute-0 nova_compute[192716]: 2025-10-07 21:52:41.710 2 DEBUG nova.network.os_vif_util [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Converting VIF {"id": "0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce", "address": "fa:16:3e:fb:65:92", "network": {"id": "f0bd9c95-1d58-40c0-8d62-097453d85d3e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-309070824-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97ef6e0949aa4dd8b3ac7e1495d532e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0f47f8fd-d8", "ovs_interfaceid": "0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 21:52:41 compute-0 nova_compute[192716]: 2025-10-07 21:52:41.711 2 DEBUG nova.network.os_vif_util [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fb:65:92,bridge_name='br-int',has_traffic_filtering=True,id=0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce,network=Network(f0bd9c95-1d58-40c0-8d62-097453d85d3e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0f47f8fd-d8') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 21:52:41 compute-0 nova_compute[192716]: 2025-10-07 21:52:41.712 2 DEBUG nova.objects.instance [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lazy-loading 'pci_devices' on Instance uuid 5138bd92-9a6e-4088-b0b2-bee3a14683ac obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 21:52:42 compute-0 nova_compute[192716]: 2025-10-07 21:52:42.221 2 DEBUG nova.virt.libvirt.driver [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] End _get_guest_xml xml=<domain type="kvm">
Oct 07 21:52:42 compute-0 nova_compute[192716]:   <uuid>5138bd92-9a6e-4088-b0b2-bee3a14683ac</uuid>
Oct 07 21:52:42 compute-0 nova_compute[192716]:   <name>instance-00000005</name>
Oct 07 21:52:42 compute-0 nova_compute[192716]:   <memory>131072</memory>
Oct 07 21:52:42 compute-0 nova_compute[192716]:   <vcpu>1</vcpu>
Oct 07 21:52:42 compute-0 nova_compute[192716]:   <metadata>
Oct 07 21:52:42 compute-0 nova_compute[192716]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 07 21:52:42 compute-0 nova_compute[192716]:       <nova:package version="32.1.0-0.20251007122402.7278e66.el10"/>
Oct 07 21:52:42 compute-0 nova_compute[192716]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-1913099881</nova:name>
Oct 07 21:52:42 compute-0 nova_compute[192716]:       <nova:creationTime>2025-10-07 21:52:41</nova:creationTime>
Oct 07 21:52:42 compute-0 nova_compute[192716]:       <nova:flavor name="m1.nano" id="e6ccb6f6-2ec4-4305-9fdd-4d931e83ed21">
Oct 07 21:52:42 compute-0 nova_compute[192716]:         <nova:memory>128</nova:memory>
Oct 07 21:52:42 compute-0 nova_compute[192716]:         <nova:disk>1</nova:disk>
Oct 07 21:52:42 compute-0 nova_compute[192716]:         <nova:swap>0</nova:swap>
Oct 07 21:52:42 compute-0 nova_compute[192716]:         <nova:ephemeral>0</nova:ephemeral>
Oct 07 21:52:42 compute-0 nova_compute[192716]:         <nova:vcpus>1</nova:vcpus>
Oct 07 21:52:42 compute-0 nova_compute[192716]:         <nova:extraSpecs>
Oct 07 21:52:42 compute-0 nova_compute[192716]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 07 21:52:42 compute-0 nova_compute[192716]:         </nova:extraSpecs>
Oct 07 21:52:42 compute-0 nova_compute[192716]:       </nova:flavor>
Oct 07 21:52:42 compute-0 nova_compute[192716]:       <nova:image uuid="c40cab67-7e52-4762-b275-de0efa24bdf4">
Oct 07 21:52:42 compute-0 nova_compute[192716]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 07 21:52:42 compute-0 nova_compute[192716]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 07 21:52:42 compute-0 nova_compute[192716]:         <nova:minDisk>1</nova:minDisk>
Oct 07 21:52:42 compute-0 nova_compute[192716]:         <nova:minRam>0</nova:minRam>
Oct 07 21:52:42 compute-0 nova_compute[192716]:         <nova:properties>
Oct 07 21:52:42 compute-0 nova_compute[192716]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 07 21:52:42 compute-0 nova_compute[192716]:         </nova:properties>
Oct 07 21:52:42 compute-0 nova_compute[192716]:       </nova:image>
Oct 07 21:52:42 compute-0 nova_compute[192716]:       <nova:owner>
Oct 07 21:52:42 compute-0 nova_compute[192716]:         <nova:user uuid="b71b837a81994b9694ede764e0406ac8">tempest-TestExecuteActionsViaActuator-1409880739-project-admin</nova:user>
Oct 07 21:52:42 compute-0 nova_compute[192716]:         <nova:project uuid="42e6cb8a77b54158b2345b916b6fd79b">tempest-TestExecuteActionsViaActuator-1409880739</nova:project>
Oct 07 21:52:42 compute-0 nova_compute[192716]:       </nova:owner>
Oct 07 21:52:42 compute-0 nova_compute[192716]:       <nova:root type="image" uuid="c40cab67-7e52-4762-b275-de0efa24bdf4"/>
Oct 07 21:52:42 compute-0 nova_compute[192716]:       <nova:ports>
Oct 07 21:52:42 compute-0 nova_compute[192716]:         <nova:port uuid="0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce">
Oct 07 21:52:42 compute-0 nova_compute[192716]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Oct 07 21:52:42 compute-0 nova_compute[192716]:         </nova:port>
Oct 07 21:52:42 compute-0 nova_compute[192716]:       </nova:ports>
Oct 07 21:52:42 compute-0 nova_compute[192716]:     </nova:instance>
Oct 07 21:52:42 compute-0 nova_compute[192716]:   </metadata>
Oct 07 21:52:42 compute-0 nova_compute[192716]:   <sysinfo type="smbios">
Oct 07 21:52:42 compute-0 nova_compute[192716]:     <system>
Oct 07 21:52:42 compute-0 nova_compute[192716]:       <entry name="manufacturer">RDO</entry>
Oct 07 21:52:42 compute-0 nova_compute[192716]:       <entry name="product">OpenStack Compute</entry>
Oct 07 21:52:42 compute-0 nova_compute[192716]:       <entry name="version">32.1.0-0.20251007122402.7278e66.el10</entry>
Oct 07 21:52:42 compute-0 nova_compute[192716]:       <entry name="serial">5138bd92-9a6e-4088-b0b2-bee3a14683ac</entry>
Oct 07 21:52:42 compute-0 nova_compute[192716]:       <entry name="uuid">5138bd92-9a6e-4088-b0b2-bee3a14683ac</entry>
Oct 07 21:52:42 compute-0 nova_compute[192716]:       <entry name="family">Virtual Machine</entry>
Oct 07 21:52:42 compute-0 nova_compute[192716]:     </system>
Oct 07 21:52:42 compute-0 nova_compute[192716]:   </sysinfo>
Oct 07 21:52:42 compute-0 nova_compute[192716]:   <os>
Oct 07 21:52:42 compute-0 nova_compute[192716]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 07 21:52:42 compute-0 nova_compute[192716]:     <boot dev="hd"/>
Oct 07 21:52:42 compute-0 nova_compute[192716]:     <smbios mode="sysinfo"/>
Oct 07 21:52:42 compute-0 nova_compute[192716]:   </os>
Oct 07 21:52:42 compute-0 nova_compute[192716]:   <features>
Oct 07 21:52:42 compute-0 nova_compute[192716]:     <acpi/>
Oct 07 21:52:42 compute-0 nova_compute[192716]:     <apic/>
Oct 07 21:52:42 compute-0 nova_compute[192716]:     <vmcoreinfo/>
Oct 07 21:52:42 compute-0 nova_compute[192716]:   </features>
Oct 07 21:52:42 compute-0 nova_compute[192716]:   <clock offset="utc">
Oct 07 21:52:42 compute-0 nova_compute[192716]:     <timer name="pit" tickpolicy="delay"/>
Oct 07 21:52:42 compute-0 nova_compute[192716]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 07 21:52:42 compute-0 nova_compute[192716]:     <timer name="hpet" present="no"/>
Oct 07 21:52:42 compute-0 nova_compute[192716]:   </clock>
Oct 07 21:52:42 compute-0 nova_compute[192716]:   <cpu mode="host-model" match="exact">
Oct 07 21:52:42 compute-0 nova_compute[192716]:     <topology sockets="1" cores="1" threads="1"/>
Oct 07 21:52:42 compute-0 nova_compute[192716]:   </cpu>
Oct 07 21:52:42 compute-0 nova_compute[192716]:   <devices>
Oct 07 21:52:42 compute-0 nova_compute[192716]:     <disk type="file" device="disk">
Oct 07 21:52:42 compute-0 nova_compute[192716]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 07 21:52:42 compute-0 nova_compute[192716]:       <source file="/var/lib/nova/instances/5138bd92-9a6e-4088-b0b2-bee3a14683ac/disk"/>
Oct 07 21:52:42 compute-0 nova_compute[192716]:       <target dev="vda" bus="virtio"/>
Oct 07 21:52:42 compute-0 nova_compute[192716]:     </disk>
Oct 07 21:52:42 compute-0 nova_compute[192716]:     <disk type="file" device="cdrom">
Oct 07 21:52:42 compute-0 nova_compute[192716]:       <driver name="qemu" type="raw" cache="none"/>
Oct 07 21:52:42 compute-0 nova_compute[192716]:       <source file="/var/lib/nova/instances/5138bd92-9a6e-4088-b0b2-bee3a14683ac/disk.config"/>
Oct 07 21:52:42 compute-0 nova_compute[192716]:       <target dev="sda" bus="sata"/>
Oct 07 21:52:42 compute-0 nova_compute[192716]:     </disk>
Oct 07 21:52:42 compute-0 nova_compute[192716]:     <interface type="ethernet">
Oct 07 21:52:42 compute-0 nova_compute[192716]:       <mac address="fa:16:3e:fb:65:92"/>
Oct 07 21:52:42 compute-0 nova_compute[192716]:       <model type="virtio"/>
Oct 07 21:52:42 compute-0 nova_compute[192716]:       <driver name="vhost" rx_queue_size="512"/>
Oct 07 21:52:42 compute-0 nova_compute[192716]:       <mtu size="1442"/>
Oct 07 21:52:42 compute-0 nova_compute[192716]:       <target dev="tap0f47f8fd-d8"/>
Oct 07 21:52:42 compute-0 nova_compute[192716]:     </interface>
Oct 07 21:52:42 compute-0 nova_compute[192716]:     <serial type="pty">
Oct 07 21:52:42 compute-0 nova_compute[192716]:       <log file="/var/lib/nova/instances/5138bd92-9a6e-4088-b0b2-bee3a14683ac/console.log" append="off"/>
Oct 07 21:52:42 compute-0 nova_compute[192716]:     </serial>
Oct 07 21:52:42 compute-0 nova_compute[192716]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 07 21:52:42 compute-0 nova_compute[192716]:     <video>
Oct 07 21:52:42 compute-0 nova_compute[192716]:       <model type="virtio"/>
Oct 07 21:52:42 compute-0 nova_compute[192716]:     </video>
Oct 07 21:52:42 compute-0 nova_compute[192716]:     <input type="tablet" bus="usb"/>
Oct 07 21:52:42 compute-0 nova_compute[192716]:     <rng model="virtio">
Oct 07 21:52:42 compute-0 nova_compute[192716]:       <backend model="random">/dev/urandom</backend>
Oct 07 21:52:42 compute-0 nova_compute[192716]:     </rng>
Oct 07 21:52:42 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root"/>
Oct 07 21:52:42 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:52:42 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:52:42 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:52:42 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:52:42 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:52:42 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:52:42 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:52:42 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:52:42 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:52:42 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:52:42 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:52:42 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:52:42 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:52:42 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:52:42 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:52:42 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:52:42 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:52:42 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:52:42 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:52:42 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:52:42 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:52:42 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:52:42 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:52:42 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:52:42 compute-0 nova_compute[192716]:     <controller type="usb" index="0"/>
Oct 07 21:52:42 compute-0 nova_compute[192716]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 07 21:52:42 compute-0 nova_compute[192716]:       <stats period="10"/>
Oct 07 21:52:42 compute-0 nova_compute[192716]:     </memballoon>
Oct 07 21:52:42 compute-0 nova_compute[192716]:   </devices>
Oct 07 21:52:42 compute-0 nova_compute[192716]: </domain>
Oct 07 21:52:42 compute-0 nova_compute[192716]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Oct 07 21:52:42 compute-0 nova_compute[192716]: 2025-10-07 21:52:42.222 2 DEBUG nova.compute.manager [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Preparing to wait for external event network-vif-plugged-0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Oct 07 21:52:42 compute-0 nova_compute[192716]: 2025-10-07 21:52:42.222 2 DEBUG oslo_concurrency.lockutils [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Acquiring lock "5138bd92-9a6e-4088-b0b2-bee3a14683ac-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:52:42 compute-0 nova_compute[192716]: 2025-10-07 21:52:42.223 2 DEBUG oslo_concurrency.lockutils [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "5138bd92-9a6e-4088-b0b2-bee3a14683ac-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:52:42 compute-0 nova_compute[192716]: 2025-10-07 21:52:42.223 2 DEBUG oslo_concurrency.lockutils [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "5138bd92-9a6e-4088-b0b2-bee3a14683ac-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:52:42 compute-0 nova_compute[192716]: 2025-10-07 21:52:42.224 2 DEBUG nova.virt.libvirt.vif [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-07T21:52:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1913099881',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1913099881',id=5,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='42e6cb8a77b54158b2345b916b6fd79b',ramdisk_id='',reservation_id='r-0x9q5j10',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1409880739',owner_user_name='tempest-TestExecuteA
ctionsViaActuator-1409880739-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T21:52:34Z,user_data=None,user_id='b71b837a81994b9694ede764e0406ac8',uuid=5138bd92-9a6e-4088-b0b2-bee3a14683ac,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce", "address": "fa:16:3e:fb:65:92", "network": {"id": "f0bd9c95-1d58-40c0-8d62-097453d85d3e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-309070824-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97ef6e0949aa4dd8b3ac7e1495d532e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0f47f8fd-d8", "ovs_interfaceid": "0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 07 21:52:42 compute-0 nova_compute[192716]: 2025-10-07 21:52:42.224 2 DEBUG nova.network.os_vif_util [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Converting VIF {"id": "0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce", "address": "fa:16:3e:fb:65:92", "network": {"id": "f0bd9c95-1d58-40c0-8d62-097453d85d3e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-309070824-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97ef6e0949aa4dd8b3ac7e1495d532e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0f47f8fd-d8", "ovs_interfaceid": "0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 21:52:42 compute-0 nova_compute[192716]: 2025-10-07 21:52:42.225 2 DEBUG nova.network.os_vif_util [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fb:65:92,bridge_name='br-int',has_traffic_filtering=True,id=0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce,network=Network(f0bd9c95-1d58-40c0-8d62-097453d85d3e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0f47f8fd-d8') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 21:52:42 compute-0 nova_compute[192716]: 2025-10-07 21:52:42.225 2 DEBUG os_vif [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fb:65:92,bridge_name='br-int',has_traffic_filtering=True,id=0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce,network=Network(f0bd9c95-1d58-40c0-8d62-097453d85d3e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0f47f8fd-d8') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 07 21:52:42 compute-0 nova_compute[192716]: 2025-10-07 21:52:42.226 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:52:42 compute-0 nova_compute[192716]: 2025-10-07 21:52:42.226 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:52:42 compute-0 nova_compute[192716]: 2025-10-07 21:52:42.227 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 21:52:42 compute-0 nova_compute[192716]: 2025-10-07 21:52:42.228 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:52:42 compute-0 nova_compute[192716]: 2025-10-07 21:52:42.228 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'a3231f06-98c9-5583-a3a2-de102fdcf8f3', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:52:42 compute-0 nova_compute[192716]: 2025-10-07 21:52:42.229 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:52:42 compute-0 nova_compute[192716]: 2025-10-07 21:52:42.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 07 21:52:42 compute-0 nova_compute[192716]: 2025-10-07 21:52:42.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:52:42 compute-0 nova_compute[192716]: 2025-10-07 21:52:42.234 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:52:42 compute-0 nova_compute[192716]: 2025-10-07 21:52:42.235 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0f47f8fd-d8, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:52:42 compute-0 nova_compute[192716]: 2025-10-07 21:52:42.235 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap0f47f8fd-d8, col_values=(('qos', UUID('0bd7538c-443f-46e3-a06a-8cc1ba8bded3')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:52:42 compute-0 nova_compute[192716]: 2025-10-07 21:52:42.235 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap0f47f8fd-d8, col_values=(('external_ids', {'iface-id': '0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fb:65:92', 'vm-uuid': '5138bd92-9a6e-4088-b0b2-bee3a14683ac'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:52:42 compute-0 nova_compute[192716]: 2025-10-07 21:52:42.237 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:52:42 compute-0 NetworkManager[51722]: <info>  [1759873962.2381] manager: (tap0f47f8fd-d8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/26)
Oct 07 21:52:42 compute-0 nova_compute[192716]: 2025-10-07 21:52:42.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 07 21:52:42 compute-0 nova_compute[192716]: 2025-10-07 21:52:42.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:52:42 compute-0 nova_compute[192716]: 2025-10-07 21:52:42.248 2 INFO os_vif [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fb:65:92,bridge_name='br-int',has_traffic_filtering=True,id=0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce,network=Network(f0bd9c95-1d58-40c0-8d62-097453d85d3e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0f47f8fd-d8')
Oct 07 21:52:43 compute-0 nova_compute[192716]: 2025-10-07 21:52:43.790 2 DEBUG nova.virt.libvirt.driver [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 07 21:52:43 compute-0 nova_compute[192716]: 2025-10-07 21:52:43.791 2 DEBUG nova.virt.libvirt.driver [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 07 21:52:43 compute-0 nova_compute[192716]: 2025-10-07 21:52:43.791 2 DEBUG nova.virt.libvirt.driver [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] No VIF found with MAC fa:16:3e:fb:65:92, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Oct 07 21:52:43 compute-0 nova_compute[192716]: 2025-10-07 21:52:43.792 2 INFO nova.virt.libvirt.driver [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Using config drive
Oct 07 21:52:44 compute-0 nova_compute[192716]: 2025-10-07 21:52:44.131 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:52:44 compute-0 nova_compute[192716]: 2025-10-07 21:52:44.310 2 WARNING neutronclient.v2_0.client [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:52:44 compute-0 nova_compute[192716]: 2025-10-07 21:52:44.801 2 INFO nova.virt.libvirt.driver [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Creating config drive at /var/lib/nova/instances/5138bd92-9a6e-4088-b0b2-bee3a14683ac/disk.config
Oct 07 21:52:44 compute-0 nova_compute[192716]: 2025-10-07 21:52:44.810 2 DEBUG oslo_concurrency.processutils [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5138bd92-9a6e-4088-b0b2-bee3a14683ac/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251007122402.7278e66.el10 -quiet -J -r -V config-2 /tmp/tmp_y2rgl2p execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:52:44 compute-0 nova_compute[192716]: 2025-10-07 21:52:44.952 2 DEBUG oslo_concurrency.processutils [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5138bd92-9a6e-4088-b0b2-bee3a14683ac/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251007122402.7278e66.el10 -quiet -J -r -V config-2 /tmp/tmp_y2rgl2p" returned: 0 in 0.142s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:52:45 compute-0 kernel: tap0f47f8fd-d8: entered promiscuous mode
Oct 07 21:52:45 compute-0 NetworkManager[51722]: <info>  [1759873965.0312] manager: (tap0f47f8fd-d8): new Tun device (/org/freedesktop/NetworkManager/Devices/27)
Oct 07 21:52:45 compute-0 ovn_controller[94904]: 2025-10-07T21:52:45Z|00049|binding|INFO|Claiming lport 0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce for this chassis.
Oct 07 21:52:45 compute-0 ovn_controller[94904]: 2025-10-07T21:52:45Z|00050|binding|INFO|0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce: Claiming fa:16:3e:fb:65:92 10.100.0.9
Oct 07 21:52:45 compute-0 nova_compute[192716]: 2025-10-07 21:52:45.031 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:52:45 compute-0 nova_compute[192716]: 2025-10-07 21:52:45.038 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:52:45 compute-0 nova_compute[192716]: 2025-10-07 21:52:45.045 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:52:45.076 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fb:65:92 10.100.0.9'], port_security=['fa:16:3e:fb:65:92 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '5138bd92-9a6e-4088-b0b2-bee3a14683ac', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f0bd9c95-1d58-40c0-8d62-097453d85d3e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '42e6cb8a77b54158b2345b916b6fd79b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0b409cfc-ce5d-4372-a7fd-bd2f8e7211c2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=badb36bd-51e1-4b06-9dec-6b9bc7164000, chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], logical_port=0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:52:45.077 103791 INFO neutron.agent.ovn.metadata.agent [-] Port 0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce in datapath f0bd9c95-1d58-40c0-8d62-097453d85d3e bound to our chassis
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:52:45.079 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f0bd9c95-1d58-40c0-8d62-097453d85d3e
Oct 07 21:52:45 compute-0 systemd-udevd[216458]: Network interface NamePolicy= disabled on kernel command line.
Oct 07 21:52:45 compute-0 systemd-machined[152719]: New machine qemu-2-instance-00000005.
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:52:45.090 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[74964e2d-2ecf-4369-b0dc-e98a3497c0e9]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:52:45.091 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf0bd9c95-11 in ovnmeta-f0bd9c95-1d58-40c0-8d62-097453d85d3e namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:52:45.096 214116 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf0bd9c95-10 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:52:45.096 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[92d3282a-cd98-4e4a-989a-636b93d13245]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:52:45.097 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[47fc2b17-6bc2-4142-884e-304716ce82ba]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:52:45 compute-0 NetworkManager[51722]: <info>  [1759873965.1044] device (tap0f47f8fd-d8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 07 21:52:45 compute-0 NetworkManager[51722]: <info>  [1759873965.1060] device (tap0f47f8fd-d8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:52:45.116 103905 DEBUG oslo.privsep.daemon [-] privsep: reply[c1edd2e9-f2b2-4f2b-b327-349eb98110b4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:52:45 compute-0 systemd[1]: Started Virtual Machine qemu-2-instance-00000005.
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:52:45.132 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[9cefae5a-2ecd-47ad-9062-d4daa6cd8ac3]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:52:45 compute-0 nova_compute[192716]: 2025-10-07 21:52:45.134 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:52:45 compute-0 ovn_controller[94904]: 2025-10-07T21:52:45Z|00051|binding|INFO|Setting lport 0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce ovn-installed in OVS
Oct 07 21:52:45 compute-0 ovn_controller[94904]: 2025-10-07T21:52:45Z|00052|binding|INFO|Setting lport 0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce up in Southbound
Oct 07 21:52:45 compute-0 nova_compute[192716]: 2025-10-07 21:52:45.139 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:52:45.166 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[b6c32937-2dac-45e3-919d-e0d3e39d6dc4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:52:45.171 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[19bd780d-98ad-491c-b2e0-75878309b85c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:52:45 compute-0 NetworkManager[51722]: <info>  [1759873965.1730] manager: (tapf0bd9c95-10): new Veth device (/org/freedesktop/NetworkManager/Devices/28)
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:52:45.210 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[d0b931cf-def6-42a3-b324-acef4b411b9b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:52:45.213 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[2bc7e88a-2d15-41bf-bcc7-f5670c31e6a5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:52:45 compute-0 NetworkManager[51722]: <info>  [1759873965.2470] device (tapf0bd9c95-10): carrier: link connected
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:52:45.254 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[e0fa82ba-6bd1-4607-9219-628e848f16e5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:52:45.275 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[d9634647-13d6-4843-912a-e5ccff7ebf41]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf0bd9c95-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:94:9a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 369882, 'reachable_time': 19733, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216491, 'error': None, 'target': 'ovnmeta-f0bd9c95-1d58-40c0-8d62-097453d85d3e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:52:45.292 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[eb726ae2-baa1-494f-b9fb-b39984a811b5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe36:949a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 369882, 'tstamp': 369882}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216492, 'error': None, 'target': 'ovnmeta-f0bd9c95-1d58-40c0-8d62-097453d85d3e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:52:45.314 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[d53f59a7-7a45-486e-8c39-37b7488ccc69]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf0bd9c95-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:94:9a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 369882, 'reachable_time': 19733, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216493, 'error': None, 'target': 'ovnmeta-f0bd9c95-1d58-40c0-8d62-097453d85d3e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:52:45 compute-0 nova_compute[192716]: 2025-10-07 21:52:45.316 2 DEBUG nova.compute.manager [req-f2b1a67b-9190-4677-b9ec-505ad154d58b req-c9213b4f-389d-4785-b792-b4a323677a8e 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Received event network-vif-plugged-0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 21:52:45 compute-0 nova_compute[192716]: 2025-10-07 21:52:45.316 2 DEBUG oslo_concurrency.lockutils [req-f2b1a67b-9190-4677-b9ec-505ad154d58b req-c9213b4f-389d-4785-b792-b4a323677a8e 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "5138bd92-9a6e-4088-b0b2-bee3a14683ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:52:45 compute-0 nova_compute[192716]: 2025-10-07 21:52:45.317 2 DEBUG oslo_concurrency.lockutils [req-f2b1a67b-9190-4677-b9ec-505ad154d58b req-c9213b4f-389d-4785-b792-b4a323677a8e 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "5138bd92-9a6e-4088-b0b2-bee3a14683ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:52:45 compute-0 nova_compute[192716]: 2025-10-07 21:52:45.317 2 DEBUG oslo_concurrency.lockutils [req-f2b1a67b-9190-4677-b9ec-505ad154d58b req-c9213b4f-389d-4785-b792-b4a323677a8e 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "5138bd92-9a6e-4088-b0b2-bee3a14683ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:52:45 compute-0 nova_compute[192716]: 2025-10-07 21:52:45.317 2 DEBUG nova.compute.manager [req-f2b1a67b-9190-4677-b9ec-505ad154d58b req-c9213b4f-389d-4785-b792-b4a323677a8e 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Processing event network-vif-plugged-0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:52:45.349 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[397fd918-bff5-4e28-a242-3c712ffbeb96]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:52:45.415 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[f96a1f9c-f14d-46fb-805f-179a7f3fea13]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:52:45.417 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf0bd9c95-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:52:45.417 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:52:45.418 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf0bd9c95-10, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:52:45 compute-0 nova_compute[192716]: 2025-10-07 21:52:45.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:52:45 compute-0 NetworkManager[51722]: <info>  [1759873965.4206] manager: (tapf0bd9c95-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/29)
Oct 07 21:52:45 compute-0 kernel: tapf0bd9c95-10: entered promiscuous mode
Oct 07 21:52:45 compute-0 nova_compute[192716]: 2025-10-07 21:52:45.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:52:45.425 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf0bd9c95-10, col_values=(('external_ids', {'iface-id': 'c0a40c81-05dd-4977-aaa2-2a56498aa3a2'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:52:45 compute-0 nova_compute[192716]: 2025-10-07 21:52:45.426 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:52:45 compute-0 ovn_controller[94904]: 2025-10-07T21:52:45Z|00053|binding|INFO|Releasing lport c0a40c81-05dd-4977-aaa2-2a56498aa3a2 from this chassis (sb_readonly=0)
Oct 07 21:52:45 compute-0 nova_compute[192716]: 2025-10-07 21:52:45.427 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:52:45.429 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[9a7d9060-c8cc-4736-a272-f5a288304c73]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:52:45.429 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f0bd9c95-1d58-40c0-8d62-097453d85d3e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f0bd9c95-1d58-40c0-8d62-097453d85d3e.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:52:45.430 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f0bd9c95-1d58-40c0-8d62-097453d85d3e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f0bd9c95-1d58-40c0-8d62-097453d85d3e.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:52:45.430 103791 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for f0bd9c95-1d58-40c0-8d62-097453d85d3e disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:52:45.430 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f0bd9c95-1d58-40c0-8d62-097453d85d3e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f0bd9c95-1d58-40c0-8d62-097453d85d3e.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:52:45.431 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[fb36cbd3-858e-4ec5-a46d-9f2e5646ea4f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:52:45.432 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f0bd9c95-1d58-40c0-8d62-097453d85d3e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f0bd9c95-1d58-40c0-8d62-097453d85d3e.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:52:45.432 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[2eb8b2f0-d494-4d50-8828-064638e911e3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:52:45.433 103791 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]: global
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]:     log         /dev/log local0 debug
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]:     log-tag     haproxy-metadata-proxy-f0bd9c95-1d58-40c0-8d62-097453d85d3e
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]:     user        root
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]:     group       root
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]:     maxconn     1024
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]:     pidfile     /var/lib/neutron/external/pids/f0bd9c95-1d58-40c0-8d62-097453d85d3e.pid.haproxy
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]:     daemon
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]: 
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]: defaults
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]:     log global
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]:     mode http
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]:     option httplog
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]:     option dontlognull
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]:     option http-server-close
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]:     option forwardfor
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]:     retries                 3
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]:     timeout http-request    30s
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]:     timeout connect         30s
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]:     timeout client          32s
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]:     timeout server          32s
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]:     timeout http-keep-alive 30s
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]: 
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]: listen listener
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]:     bind 169.254.169.254:80
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]:     
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]:     server metadata /var/lib/neutron/metadata_proxy
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]: 
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]:     http-request add-header X-OVN-Network-ID f0bd9c95-1d58-40c0-8d62-097453d85d3e
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Oct 07 21:52:45 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:52:45.434 103791 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f0bd9c95-1d58-40c0-8d62-097453d85d3e', 'env', 'PROCESS_TAG=haproxy-f0bd9c95-1d58-40c0-8d62-097453d85d3e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f0bd9c95-1d58-40c0-8d62-097453d85d3e.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Oct 07 21:52:45 compute-0 nova_compute[192716]: 2025-10-07 21:52:45.437 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:52:45 compute-0 podman[216532]: 2025-10-07 21:52:45.855340856 +0000 UTC m=+0.055916964 container create e780af2d5ae58d6f3b8b43c56e407e8240a0f5ef6fc010fc71d82ffcdd3d24b6 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f0bd9c95-1d58-40c0-8d62-097453d85d3e, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251007, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 07 21:52:45 compute-0 systemd[1]: Started libpod-conmon-e780af2d5ae58d6f3b8b43c56e407e8240a0f5ef6fc010fc71d82ffcdd3d24b6.scope.
Oct 07 21:52:45 compute-0 podman[216532]: 2025-10-07 21:52:45.821747663 +0000 UTC m=+0.022323791 image pull 24d4277b41bbd1d97b6f360ea068040fe96182680512bacad34d1f578f4798a9 38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Oct 07 21:52:45 compute-0 systemd[1]: Started libcrun container.
Oct 07 21:52:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/efa3994277d38f899a53625af013c8b4fd470a62ea7c1116dfc5debd5a91effa/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 07 21:52:45 compute-0 podman[216532]: 2025-10-07 21:52:45.96604333 +0000 UTC m=+0.166619428 container init e780af2d5ae58d6f3b8b43c56e407e8240a0f5ef6fc010fc71d82ffcdd3d24b6 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f0bd9c95-1d58-40c0-8d62-097453d85d3e, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 07 21:52:45 compute-0 podman[216532]: 2025-10-07 21:52:45.974376891 +0000 UTC m=+0.174952969 container start e780af2d5ae58d6f3b8b43c56e407e8240a0f5ef6fc010fc71d82ffcdd3d24b6 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f0bd9c95-1d58-40c0-8d62-097453d85d3e, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2)
Oct 07 21:52:46 compute-0 neutron-haproxy-ovnmeta-f0bd9c95-1d58-40c0-8d62-097453d85d3e[216546]: [NOTICE]   (216550) : New worker (216552) forked
Oct 07 21:52:46 compute-0 neutron-haproxy-ovnmeta-f0bd9c95-1d58-40c0-8d62-097453d85d3e[216546]: [NOTICE]   (216550) : Loading success.
Oct 07 21:52:46 compute-0 nova_compute[192716]: 2025-10-07 21:52:46.235 2 DEBUG nova.compute.manager [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Oct 07 21:52:46 compute-0 nova_compute[192716]: 2025-10-07 21:52:46.239 2 DEBUG nova.virt.libvirt.driver [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Oct 07 21:52:46 compute-0 nova_compute[192716]: 2025-10-07 21:52:46.243 2 INFO nova.virt.libvirt.driver [-] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Instance spawned successfully.
Oct 07 21:52:46 compute-0 nova_compute[192716]: 2025-10-07 21:52:46.244 2 DEBUG nova.virt.libvirt.driver [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Oct 07 21:52:46 compute-0 nova_compute[192716]: 2025-10-07 21:52:46.758 2 DEBUG nova.virt.libvirt.driver [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 21:52:46 compute-0 nova_compute[192716]: 2025-10-07 21:52:46.759 2 DEBUG nova.virt.libvirt.driver [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 21:52:46 compute-0 nova_compute[192716]: 2025-10-07 21:52:46.760 2 DEBUG nova.virt.libvirt.driver [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 21:52:46 compute-0 nova_compute[192716]: 2025-10-07 21:52:46.760 2 DEBUG nova.virt.libvirt.driver [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 21:52:46 compute-0 nova_compute[192716]: 2025-10-07 21:52:46.761 2 DEBUG nova.virt.libvirt.driver [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 21:52:46 compute-0 nova_compute[192716]: 2025-10-07 21:52:46.761 2 DEBUG nova.virt.libvirt.driver [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 21:52:47 compute-0 nova_compute[192716]: 2025-10-07 21:52:47.237 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:52:47 compute-0 nova_compute[192716]: 2025-10-07 21:52:47.272 2 INFO nova.compute.manager [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Took 11.32 seconds to spawn the instance on the hypervisor.
Oct 07 21:52:47 compute-0 nova_compute[192716]: 2025-10-07 21:52:47.273 2 DEBUG nova.compute.manager [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 07 21:52:47 compute-0 nova_compute[192716]: 2025-10-07 21:52:47.541 2 DEBUG nova.compute.manager [req-b7e966dd-d832-4852-b52d-baea3918e84a req-63f61cf9-55f4-4650-8417-e6ac6c5a0832 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Received event network-vif-plugged-0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 21:52:47 compute-0 nova_compute[192716]: 2025-10-07 21:52:47.541 2 DEBUG oslo_concurrency.lockutils [req-b7e966dd-d832-4852-b52d-baea3918e84a req-63f61cf9-55f4-4650-8417-e6ac6c5a0832 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "5138bd92-9a6e-4088-b0b2-bee3a14683ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:52:47 compute-0 nova_compute[192716]: 2025-10-07 21:52:47.542 2 DEBUG oslo_concurrency.lockutils [req-b7e966dd-d832-4852-b52d-baea3918e84a req-63f61cf9-55f4-4650-8417-e6ac6c5a0832 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "5138bd92-9a6e-4088-b0b2-bee3a14683ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:52:47 compute-0 nova_compute[192716]: 2025-10-07 21:52:47.542 2 DEBUG oslo_concurrency.lockutils [req-b7e966dd-d832-4852-b52d-baea3918e84a req-63f61cf9-55f4-4650-8417-e6ac6c5a0832 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "5138bd92-9a6e-4088-b0b2-bee3a14683ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:52:47 compute-0 nova_compute[192716]: 2025-10-07 21:52:47.542 2 DEBUG nova.compute.manager [req-b7e966dd-d832-4852-b52d-baea3918e84a req-63f61cf9-55f4-4650-8417-e6ac6c5a0832 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] No waiting events found dispatching network-vif-plugged-0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 21:52:47 compute-0 nova_compute[192716]: 2025-10-07 21:52:47.543 2 WARNING nova.compute.manager [req-b7e966dd-d832-4852-b52d-baea3918e84a req-63f61cf9-55f4-4650-8417-e6ac6c5a0832 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Received unexpected event network-vif-plugged-0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce for instance with vm_state active and task_state None.
Oct 07 21:52:47 compute-0 nova_compute[192716]: 2025-10-07 21:52:47.815 2 INFO nova.compute.manager [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Took 16.57 seconds to build instance.
Oct 07 21:52:47 compute-0 podman[216561]: 2025-10-07 21:52:47.893026586 +0000 UTC m=+0.137233091 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Oct 07 21:52:48 compute-0 nova_compute[192716]: 2025-10-07 21:52:48.327 2 DEBUG oslo_concurrency.lockutils [None req-c0ad3dd1-a40c-4ee8-89c4-1cae01449280 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "5138bd92-9a6e-4088-b0b2-bee3a14683ac" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.099s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:52:49 compute-0 nova_compute[192716]: 2025-10-07 21:52:49.133 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:52:50 compute-0 podman[216586]: 2025-10-07 21:52:50.852937965 +0000 UTC m=+0.077031570 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20251007, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct 07 21:52:52 compute-0 nova_compute[192716]: 2025-10-07 21:52:52.240 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:52:54 compute-0 nova_compute[192716]: 2025-10-07 21:52:54.135 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:52:55 compute-0 podman[216606]: 2025-10-07 21:52:55.871216198 +0000 UTC m=+0.090159364 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, release=1755695350, maintainer=Red Hat, Inc.)
Oct 07 21:52:56 compute-0 sshd-session[216438]: ssh_dispatch_run_fatal: Connection from 27.79.44.171 port 34138: Connection timed out [preauth]
Oct 07 21:52:57 compute-0 nova_compute[192716]: 2025-10-07 21:52:57.241 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:52:57 compute-0 ovn_controller[94904]: 2025-10-07T21:52:57Z|00003|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:fb:65:92 10.100.0.9
Oct 07 21:52:57 compute-0 ovn_controller[94904]: 2025-10-07T21:52:57Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:fb:65:92 10.100.0.9
Oct 07 21:52:59 compute-0 nova_compute[192716]: 2025-10-07 21:52:59.138 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:52:59 compute-0 podman[203153]: time="2025-10-07T21:52:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 21:52:59 compute-0 podman[203153]: @ - - [07/Oct/2025:21:52:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20751 "" "Go-http-client/1.1"
Oct 07 21:52:59 compute-0 podman[203153]: @ - - [07/Oct/2025:21:52:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3462 "" "Go-http-client/1.1"
Oct 07 21:53:01 compute-0 openstack_network_exporter[205305]: ERROR   21:53:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 21:53:01 compute-0 openstack_network_exporter[205305]: ERROR   21:53:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:53:01 compute-0 openstack_network_exporter[205305]: ERROR   21:53:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:53:01 compute-0 openstack_network_exporter[205305]: ERROR   21:53:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 21:53:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 21:53:01 compute-0 openstack_network_exporter[205305]: ERROR   21:53:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 21:53:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 21:53:02 compute-0 nova_compute[192716]: 2025-10-07 21:53:02.243 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:53:04 compute-0 nova_compute[192716]: 2025-10-07 21:53:04.139 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:53:05 compute-0 unix_chkpwd[216641]: password check failed for user (root)
Oct 07 21:53:05 compute-0 sshd-session[216639]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.99  user=root
Oct 07 21:53:07 compute-0 nova_compute[192716]: 2025-10-07 21:53:07.245 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:53:07 compute-0 podman[216642]: 2025-10-07 21:53:07.885999464 +0000 UTC m=+0.112186316 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=iscsid, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 07 21:53:07 compute-0 sshd-session[216639]: Failed password for root from 193.46.255.99 port 28622 ssh2
Oct 07 21:53:07 compute-0 podman[216643]: 2025-10-07 21:53:07.901110174 +0000 UTC m=+0.118739288 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251007, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 07 21:53:09 compute-0 nova_compute[192716]: 2025-10-07 21:53:09.142 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:53:09 compute-0 unix_chkpwd[216684]: password check failed for user (root)
Oct 07 21:53:09 compute-0 sshd-session[216682]: Invalid user admin from 103.115.24.11 port 35202
Oct 07 21:53:09 compute-0 sshd-session[216682]: pam_unix(sshd:auth): check pass; user unknown
Oct 07 21:53:09 compute-0 sshd-session[216682]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.115.24.11
Oct 07 21:53:11 compute-0 sshd-session[216639]: Failed password for root from 193.46.255.99 port 28622 ssh2
Oct 07 21:53:11 compute-0 sshd-session[216682]: Failed password for invalid user admin from 103.115.24.11 port 35202 ssh2
Oct 07 21:53:11 compute-0 podman[216685]: 2025-10-07 21:53:11.877743513 +0000 UTC m=+0.105401349 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 07 21:53:11 compute-0 sshd-session[216682]: Received disconnect from 103.115.24.11 port 35202:11: Bye Bye [preauth]
Oct 07 21:53:11 compute-0 sshd-session[216682]: Disconnected from invalid user admin 103.115.24.11 port 35202 [preauth]
Oct 07 21:53:11 compute-0 unix_chkpwd[216710]: password check failed for user (root)
Oct 07 21:53:12 compute-0 nova_compute[192716]: 2025-10-07 21:53:12.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:53:14 compute-0 sshd-session[216639]: Failed password for root from 193.46.255.99 port 28622 ssh2
Oct 07 21:53:14 compute-0 nova_compute[192716]: 2025-10-07 21:53:14.143 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:53:16 compute-0 sshd-session[216639]: Received disconnect from 193.46.255.99 port 28622:11:  [preauth]
Oct 07 21:53:16 compute-0 sshd-session[216639]: Disconnected from authenticating user root 193.46.255.99 port 28622 [preauth]
Oct 07 21:53:16 compute-0 sshd-session[216639]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.99  user=root
Oct 07 21:53:17 compute-0 unix_chkpwd[216713]: password check failed for user (root)
Oct 07 21:53:17 compute-0 sshd-session[216711]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.99  user=root
Oct 07 21:53:17 compute-0 nova_compute[192716]: 2025-10-07 21:53:17.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:53:18 compute-0 podman[216714]: 2025-10-07 21:53:18.906355029 +0000 UTC m=+0.133640935 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 07 21:53:18 compute-0 sshd-session[216711]: Failed password for root from 193.46.255.99 port 61266 ssh2
Oct 07 21:53:19 compute-0 nova_compute[192716]: 2025-10-07 21:53:19.145 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:53:19 compute-0 unix_chkpwd[216740]: password check failed for user (root)
Oct 07 21:53:20 compute-0 sshd-session[216711]: Failed password for root from 193.46.255.99 port 61266 ssh2
Oct 07 21:53:20 compute-0 nova_compute[192716]: 2025-10-07 21:53:20.915 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:53:20 compute-0 nova_compute[192716]: 2025-10-07 21:53:20.916 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:53:20 compute-0 nova_compute[192716]: 2025-10-07 21:53:20.916 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:53:20 compute-0 nova_compute[192716]: 2025-10-07 21:53:20.917 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 07 21:53:20 compute-0 nova_compute[192716]: 2025-10-07 21:53:20.987 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:53:21 compute-0 unix_chkpwd[216741]: password check failed for user (root)
Oct 07 21:53:21 compute-0 podman[216742]: 2025-10-07 21:53:21.849063337 +0000 UTC m=+0.083671489 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007)
Oct 07 21:53:22 compute-0 nova_compute[192716]: 2025-10-07 21:53:22.275 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:53:22 compute-0 sshd-session[216711]: Failed password for root from 193.46.255.99 port 61266 ssh2
Oct 07 21:53:22 compute-0 nova_compute[192716]: 2025-10-07 21:53:22.985 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:53:23 compute-0 sshd-session[216711]: Received disconnect from 193.46.255.99 port 61266:11:  [preauth]
Oct 07 21:53:23 compute-0 sshd-session[216711]: Disconnected from authenticating user root 193.46.255.99 port 61266 [preauth]
Oct 07 21:53:23 compute-0 sshd-session[216711]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.99  user=root
Oct 07 21:53:23 compute-0 nova_compute[192716]: 2025-10-07 21:53:23.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:53:23 compute-0 nova_compute[192716]: 2025-10-07 21:53:23.991 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:53:24 compute-0 nova_compute[192716]: 2025-10-07 21:53:24.148 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:53:24 compute-0 unix_chkpwd[216763]: password check failed for user (root)
Oct 07 21:53:24 compute-0 sshd-session[216761]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.99  user=root
Oct 07 21:53:24 compute-0 nova_compute[192716]: 2025-10-07 21:53:24.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:53:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:53:25.603 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:53:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:53:25.604 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:53:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:53:25.607 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.003s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:53:26 compute-0 sshd-session[216761]: Failed password for root from 193.46.255.99 port 56406 ssh2
Oct 07 21:53:26 compute-0 sshd-session[216761]: Received disconnect from 193.46.255.99 port 56406:11:  [preauth]
Oct 07 21:53:26 compute-0 sshd-session[216761]: Disconnected from authenticating user root 193.46.255.99 port 56406 [preauth]
Oct 07 21:53:26 compute-0 podman[216765]: 2025-10-07 21:53:26.859998755 +0000 UTC m=+0.089648460 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, architecture=x86_64, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Oct 07 21:53:26 compute-0 nova_compute[192716]: 2025-10-07 21:53:26.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:53:27 compute-0 nova_compute[192716]: 2025-10-07 21:53:27.277 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:53:27 compute-0 nova_compute[192716]: 2025-10-07 21:53:27.507 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:53:27 compute-0 nova_compute[192716]: 2025-10-07 21:53:27.508 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:53:27 compute-0 nova_compute[192716]: 2025-10-07 21:53:27.509 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:53:27 compute-0 nova_compute[192716]: 2025-10-07 21:53:27.509 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 07 21:53:28 compute-0 nova_compute[192716]: 2025-10-07 21:53:28.561 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5138bd92-9a6e-4088-b0b2-bee3a14683ac/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:53:28 compute-0 nova_compute[192716]: 2025-10-07 21:53:28.628 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5138bd92-9a6e-4088-b0b2-bee3a14683ac/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:53:28 compute-0 nova_compute[192716]: 2025-10-07 21:53:28.629 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5138bd92-9a6e-4088-b0b2-bee3a14683ac/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:53:28 compute-0 nova_compute[192716]: 2025-10-07 21:53:28.701 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5138bd92-9a6e-4088-b0b2-bee3a14683ac/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:53:28 compute-0 nova_compute[192716]: 2025-10-07 21:53:28.858 2 WARNING nova.virt.libvirt.driver [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 21:53:28 compute-0 nova_compute[192716]: 2025-10-07 21:53:28.860 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:53:28 compute-0 nova_compute[192716]: 2025-10-07 21:53:28.881 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.021s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:53:28 compute-0 nova_compute[192716]: 2025-10-07 21:53:28.882 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5683MB free_disk=73.2787094116211GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 07 21:53:28 compute-0 nova_compute[192716]: 2025-10-07 21:53:28.882 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:53:28 compute-0 nova_compute[192716]: 2025-10-07 21:53:28.883 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:53:29 compute-0 nova_compute[192716]: 2025-10-07 21:53:29.150 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:53:29 compute-0 podman[203153]: time="2025-10-07T21:53:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 21:53:29 compute-0 podman[203153]: @ - - [07/Oct/2025:21:53:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20751 "" "Go-http-client/1.1"
Oct 07 21:53:29 compute-0 podman[203153]: @ - - [07/Oct/2025:21:53:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3465 "" "Go-http-client/1.1"
Oct 07 21:53:29 compute-0 nova_compute[192716]: 2025-10-07 21:53:29.933 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Instance 5138bd92-9a6e-4088-b0b2-bee3a14683ac actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 07 21:53:29 compute-0 nova_compute[192716]: 2025-10-07 21:53:29.934 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 07 21:53:29 compute-0 nova_compute[192716]: 2025-10-07 21:53:29.934 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:53:28 up  1:02,  0 user,  load average: 0.20, 0.20, 0.36\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_42e6cb8a77b54158b2345b916b6fd79b': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 07 21:53:29 compute-0 nova_compute[192716]: 2025-10-07 21:53:29.980 2 DEBUG nova.compute.provider_tree [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 21:53:30 compute-0 nova_compute[192716]: 2025-10-07 21:53:30.486 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 21:53:30 compute-0 nova_compute[192716]: 2025-10-07 21:53:30.995 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 07 21:53:30 compute-0 nova_compute[192716]: 2025-10-07 21:53:30.996 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.113s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:53:31 compute-0 openstack_network_exporter[205305]: ERROR   21:53:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 21:53:31 compute-0 openstack_network_exporter[205305]: ERROR   21:53:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:53:31 compute-0 openstack_network_exporter[205305]: ERROR   21:53:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:53:31 compute-0 openstack_network_exporter[205305]: ERROR   21:53:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 21:53:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 21:53:31 compute-0 openstack_network_exporter[205305]: ERROR   21:53:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 21:53:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 21:53:32 compute-0 nova_compute[192716]: 2025-10-07 21:53:32.279 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:53:34 compute-0 nova_compute[192716]: 2025-10-07 21:53:34.151 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:53:35 compute-0 nova_compute[192716]: 2025-10-07 21:53:35.086 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:53:35 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:53:35.088 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:e1:cc', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '62:84:ba:a4:1b:31'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 21:53:35 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:53:35.089 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 07 21:53:36 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:53:36.090 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=dca786dc-b408-4181-8e47-0e14c60f13da, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:53:37 compute-0 nova_compute[192716]: 2025-10-07 21:53:37.282 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:53:38 compute-0 podman[216797]: 2025-10-07 21:53:38.854500542 +0000 UTC m=+0.074781995 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, config_id=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 07 21:53:38 compute-0 podman[216798]: 2025-10-07 21:53:38.866936217 +0000 UTC m=+0.088361172 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.license=GPLv2)
Oct 07 21:53:39 compute-0 nova_compute[192716]: 2025-10-07 21:53:39.154 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:53:42 compute-0 nova_compute[192716]: 2025-10-07 21:53:42.287 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:53:42 compute-0 podman[216837]: 2025-10-07 21:53:42.842938102 +0000 UTC m=+0.067149457 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 07 21:53:44 compute-0 nova_compute[192716]: 2025-10-07 21:53:44.157 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:53:47 compute-0 nova_compute[192716]: 2025-10-07 21:53:47.293 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:53:49 compute-0 nova_compute[192716]: 2025-10-07 21:53:49.199 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:53:49 compute-0 podman[216867]: 2025-10-07 21:53:49.898028194 +0000 UTC m=+0.129966460 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2)
Oct 07 21:53:52 compute-0 nova_compute[192716]: 2025-10-07 21:53:52.294 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:53:52 compute-0 nova_compute[192716]: 2025-10-07 21:53:52.760 2 DEBUG oslo_concurrency.lockutils [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Acquiring lock "c0b3d97e-60fb-487c-90d3-2b48392ff09f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:53:52 compute-0 nova_compute[192716]: 2025-10-07 21:53:52.760 2 DEBUG oslo_concurrency.lockutils [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "c0b3d97e-60fb-487c-90d3-2b48392ff09f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:53:52 compute-0 podman[216893]: 2025-10-07 21:53:52.818514467 +0000 UTC m=+0.055744672 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251007, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Oct 07 21:53:53 compute-0 nova_compute[192716]: 2025-10-07 21:53:53.265 2 DEBUG nova.compute.manager [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Oct 07 21:53:53 compute-0 nova_compute[192716]: 2025-10-07 21:53:53.858 2 DEBUG oslo_concurrency.lockutils [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:53:53 compute-0 nova_compute[192716]: 2025-10-07 21:53:53.859 2 DEBUG oslo_concurrency.lockutils [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:53:53 compute-0 nova_compute[192716]: 2025-10-07 21:53:53.870 2 DEBUG nova.virt.hardware [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Oct 07 21:53:53 compute-0 nova_compute[192716]: 2025-10-07 21:53:53.871 2 INFO nova.compute.claims [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Claim successful on node compute-0.ctlplane.example.com
Oct 07 21:53:54 compute-0 nova_compute[192716]: 2025-10-07 21:53:54.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:53:54 compute-0 nova_compute[192716]: 2025-10-07 21:53:54.955 2 DEBUG nova.compute.provider_tree [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 21:53:55 compute-0 nova_compute[192716]: 2025-10-07 21:53:55.463 2 DEBUG nova.scheduler.client.report [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 21:53:55 compute-0 nova_compute[192716]: 2025-10-07 21:53:55.972 2 DEBUG oslo_concurrency.lockutils [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.113s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:53:55 compute-0 nova_compute[192716]: 2025-10-07 21:53:55.973 2 DEBUG nova.compute.manager [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Oct 07 21:53:56 compute-0 nova_compute[192716]: 2025-10-07 21:53:56.483 2 DEBUG nova.compute.manager [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Oct 07 21:53:56 compute-0 nova_compute[192716]: 2025-10-07 21:53:56.484 2 DEBUG nova.network.neutron [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Oct 07 21:53:56 compute-0 nova_compute[192716]: 2025-10-07 21:53:56.484 2 WARNING neutronclient.v2_0.client [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:53:56 compute-0 nova_compute[192716]: 2025-10-07 21:53:56.484 2 WARNING neutronclient.v2_0.client [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:53:56 compute-0 nova_compute[192716]: 2025-10-07 21:53:56.991 2 INFO nova.virt.libvirt.driver [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 07 21:53:57 compute-0 nova_compute[192716]: 2025-10-07 21:53:57.210 2 DEBUG nova.network.neutron [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Successfully created port: c20e63df-b9ab-4daf-b7bb-502dff45fae0 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Oct 07 21:53:57 compute-0 nova_compute[192716]: 2025-10-07 21:53:57.296 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:53:57 compute-0 nova_compute[192716]: 2025-10-07 21:53:57.516 2 DEBUG nova.compute.manager [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Oct 07 21:53:57 compute-0 podman[216912]: 2025-10-07 21:53:57.864175096 +0000 UTC m=+0.095882918 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, version=9.6, distribution-scope=public, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, architecture=x86_64, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=)
Oct 07 21:53:57 compute-0 nova_compute[192716]: 2025-10-07 21:53:57.916 2 DEBUG nova.network.neutron [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Successfully updated port: c20e63df-b9ab-4daf-b7bb-502dff45fae0 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Oct 07 21:53:58 compute-0 nova_compute[192716]: 2025-10-07 21:53:58.003 2 DEBUG nova.compute.manager [req-cacc6b4b-f8fe-4583-944c-dc6901fb22b0 req-66d8e526-8e96-4137-8f3d-9bc6d3a03a1c 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Received event network-changed-c20e63df-b9ab-4daf-b7bb-502dff45fae0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 21:53:58 compute-0 nova_compute[192716]: 2025-10-07 21:53:58.003 2 DEBUG nova.compute.manager [req-cacc6b4b-f8fe-4583-944c-dc6901fb22b0 req-66d8e526-8e96-4137-8f3d-9bc6d3a03a1c 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Refreshing instance network info cache due to event network-changed-c20e63df-b9ab-4daf-b7bb-502dff45fae0. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Oct 07 21:53:58 compute-0 nova_compute[192716]: 2025-10-07 21:53:58.003 2 DEBUG oslo_concurrency.lockutils [req-cacc6b4b-f8fe-4583-944c-dc6901fb22b0 req-66d8e526-8e96-4137-8f3d-9bc6d3a03a1c 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "refresh_cache-c0b3d97e-60fb-487c-90d3-2b48392ff09f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 07 21:53:58 compute-0 nova_compute[192716]: 2025-10-07 21:53:58.004 2 DEBUG oslo_concurrency.lockutils [req-cacc6b4b-f8fe-4583-944c-dc6901fb22b0 req-66d8e526-8e96-4137-8f3d-9bc6d3a03a1c 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquired lock "refresh_cache-c0b3d97e-60fb-487c-90d3-2b48392ff09f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 07 21:53:58 compute-0 nova_compute[192716]: 2025-10-07 21:53:58.004 2 DEBUG nova.network.neutron [req-cacc6b4b-f8fe-4583-944c-dc6901fb22b0 req-66d8e526-8e96-4137-8f3d-9bc6d3a03a1c 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Refreshing network info cache for port c20e63df-b9ab-4daf-b7bb-502dff45fae0 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Oct 07 21:53:58 compute-0 nova_compute[192716]: 2025-10-07 21:53:58.425 2 DEBUG oslo_concurrency.lockutils [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Acquiring lock "refresh_cache-c0b3d97e-60fb-487c-90d3-2b48392ff09f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 07 21:53:58 compute-0 nova_compute[192716]: 2025-10-07 21:53:58.509 2 WARNING neutronclient.v2_0.client [req-cacc6b4b-f8fe-4583-944c-dc6901fb22b0 req-66d8e526-8e96-4137-8f3d-9bc6d3a03a1c 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:53:58 compute-0 nova_compute[192716]: 2025-10-07 21:53:58.535 2 DEBUG nova.compute.manager [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Oct 07 21:53:58 compute-0 nova_compute[192716]: 2025-10-07 21:53:58.537 2 DEBUG nova.virt.libvirt.driver [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Oct 07 21:53:58 compute-0 nova_compute[192716]: 2025-10-07 21:53:58.538 2 INFO nova.virt.libvirt.driver [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Creating image(s)
Oct 07 21:53:58 compute-0 nova_compute[192716]: 2025-10-07 21:53:58.539 2 DEBUG oslo_concurrency.lockutils [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Acquiring lock "/var/lib/nova/instances/c0b3d97e-60fb-487c-90d3-2b48392ff09f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:53:58 compute-0 nova_compute[192716]: 2025-10-07 21:53:58.539 2 DEBUG oslo_concurrency.lockutils [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "/var/lib/nova/instances/c0b3d97e-60fb-487c-90d3-2b48392ff09f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:53:58 compute-0 nova_compute[192716]: 2025-10-07 21:53:58.540 2 DEBUG oslo_concurrency.lockutils [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "/var/lib/nova/instances/c0b3d97e-60fb-487c-90d3-2b48392ff09f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:53:58 compute-0 nova_compute[192716]: 2025-10-07 21:53:58.541 2 DEBUG oslo_utils.imageutils.format_inspector [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 21:53:58 compute-0 nova_compute[192716]: 2025-10-07 21:53:58.547 2 DEBUG oslo_utils.imageutils.format_inspector [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 21:53:58 compute-0 nova_compute[192716]: 2025-10-07 21:53:58.550 2 DEBUG oslo_concurrency.processutils [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:53:58 compute-0 nova_compute[192716]: 2025-10-07 21:53:58.624 2 DEBUG oslo_concurrency.processutils [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:53:58 compute-0 nova_compute[192716]: 2025-10-07 21:53:58.625 2 DEBUG oslo_concurrency.lockutils [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Acquiring lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:53:58 compute-0 nova_compute[192716]: 2025-10-07 21:53:58.626 2 DEBUG oslo_concurrency.lockutils [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:53:58 compute-0 nova_compute[192716]: 2025-10-07 21:53:58.626 2 DEBUG oslo_utils.imageutils.format_inspector [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 21:53:58 compute-0 nova_compute[192716]: 2025-10-07 21:53:58.629 2 DEBUG oslo_utils.imageutils.format_inspector [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 21:53:58 compute-0 nova_compute[192716]: 2025-10-07 21:53:58.629 2 DEBUG oslo_concurrency.processutils [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:53:58 compute-0 nova_compute[192716]: 2025-10-07 21:53:58.686 2 DEBUG oslo_concurrency.processutils [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:53:58 compute-0 nova_compute[192716]: 2025-10-07 21:53:58.687 2 DEBUG oslo_concurrency.processutils [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71,backing_fmt=raw /var/lib/nova/instances/c0b3d97e-60fb-487c-90d3-2b48392ff09f/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:53:58 compute-0 nova_compute[192716]: 2025-10-07 21:53:58.723 2 DEBUG oslo_concurrency.processutils [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71,backing_fmt=raw /var/lib/nova/instances/c0b3d97e-60fb-487c-90d3-2b48392ff09f/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:53:58 compute-0 nova_compute[192716]: 2025-10-07 21:53:58.724 2 DEBUG oslo_concurrency.lockutils [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.098s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:53:58 compute-0 nova_compute[192716]: 2025-10-07 21:53:58.724 2 DEBUG oslo_concurrency.processutils [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:53:58 compute-0 nova_compute[192716]: 2025-10-07 21:53:58.749 2 DEBUG nova.network.neutron [req-cacc6b4b-f8fe-4583-944c-dc6901fb22b0 req-66d8e526-8e96-4137-8f3d-9bc6d3a03a1c 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 07 21:53:58 compute-0 nova_compute[192716]: 2025-10-07 21:53:58.777 2 DEBUG oslo_concurrency.processutils [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:53:58 compute-0 nova_compute[192716]: 2025-10-07 21:53:58.778 2 DEBUG nova.virt.disk.api [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Checking if we can resize image /var/lib/nova/instances/c0b3d97e-60fb-487c-90d3-2b48392ff09f/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 07 21:53:58 compute-0 nova_compute[192716]: 2025-10-07 21:53:58.778 2 DEBUG oslo_concurrency.processutils [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c0b3d97e-60fb-487c-90d3-2b48392ff09f/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:53:58 compute-0 nova_compute[192716]: 2025-10-07 21:53:58.833 2 DEBUG oslo_concurrency.processutils [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c0b3d97e-60fb-487c-90d3-2b48392ff09f/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:53:58 compute-0 nova_compute[192716]: 2025-10-07 21:53:58.834 2 DEBUG nova.virt.disk.api [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Cannot resize image /var/lib/nova/instances/c0b3d97e-60fb-487c-90d3-2b48392ff09f/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 07 21:53:58 compute-0 nova_compute[192716]: 2025-10-07 21:53:58.834 2 DEBUG nova.virt.libvirt.driver [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Oct 07 21:53:58 compute-0 nova_compute[192716]: 2025-10-07 21:53:58.834 2 DEBUG nova.virt.libvirt.driver [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Ensure instance console log exists: /var/lib/nova/instances/c0b3d97e-60fb-487c-90d3-2b48392ff09f/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Oct 07 21:53:58 compute-0 nova_compute[192716]: 2025-10-07 21:53:58.835 2 DEBUG oslo_concurrency.lockutils [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:53:58 compute-0 nova_compute[192716]: 2025-10-07 21:53:58.835 2 DEBUG oslo_concurrency.lockutils [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:53:58 compute-0 nova_compute[192716]: 2025-10-07 21:53:58.835 2 DEBUG oslo_concurrency.lockutils [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:53:58 compute-0 nova_compute[192716]: 2025-10-07 21:53:58.886 2 DEBUG nova.network.neutron [req-cacc6b4b-f8fe-4583-944c-dc6901fb22b0 req-66d8e526-8e96-4137-8f3d-9bc6d3a03a1c 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 21:53:59 compute-0 nova_compute[192716]: 2025-10-07 21:53:59.203 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:53:59 compute-0 nova_compute[192716]: 2025-10-07 21:53:59.394 2 DEBUG oslo_concurrency.lockutils [req-cacc6b4b-f8fe-4583-944c-dc6901fb22b0 req-66d8e526-8e96-4137-8f3d-9bc6d3a03a1c 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Releasing lock "refresh_cache-c0b3d97e-60fb-487c-90d3-2b48392ff09f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 07 21:53:59 compute-0 nova_compute[192716]: 2025-10-07 21:53:59.395 2 DEBUG oslo_concurrency.lockutils [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Acquired lock "refresh_cache-c0b3d97e-60fb-487c-90d3-2b48392ff09f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 07 21:53:59 compute-0 nova_compute[192716]: 2025-10-07 21:53:59.395 2 DEBUG nova.network.neutron [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 07 21:53:59 compute-0 podman[203153]: time="2025-10-07T21:53:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 21:53:59 compute-0 podman[203153]: @ - - [07/Oct/2025:21:53:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20751 "" "Go-http-client/1.1"
Oct 07 21:53:59 compute-0 podman[203153]: @ - - [07/Oct/2025:21:53:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3463 "" "Go-http-client/1.1"
Oct 07 21:54:00 compute-0 nova_compute[192716]: 2025-10-07 21:54:00.773 2 DEBUG nova.network.neutron [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 07 21:54:01 compute-0 openstack_network_exporter[205305]: ERROR   21:54:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:54:01 compute-0 openstack_network_exporter[205305]: ERROR   21:54:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 21:54:01 compute-0 openstack_network_exporter[205305]: ERROR   21:54:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 21:54:01 compute-0 openstack_network_exporter[205305]: ERROR   21:54:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:54:01 compute-0 openstack_network_exporter[205305]: ERROR   21:54:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 21:54:01 compute-0 nova_compute[192716]: 2025-10-07 21:54:01.743 2 WARNING neutronclient.v2_0.client [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:54:02 compute-0 unix_chkpwd[216952]: password check failed for user (root)
Oct 07 21:54:02 compute-0 sshd-session[216950]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=116.110.151.5  user=root
Oct 07 21:54:02 compute-0 nova_compute[192716]: 2025-10-07 21:54:02.298 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:54:02 compute-0 nova_compute[192716]: 2025-10-07 21:54:02.769 2 DEBUG nova.network.neutron [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Updating instance_info_cache with network_info: [{"id": "c20e63df-b9ab-4daf-b7bb-502dff45fae0", "address": "fa:16:3e:95:12:f3", "network": {"id": "f0bd9c95-1d58-40c0-8d62-097453d85d3e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-309070824-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97ef6e0949aa4dd8b3ac7e1495d532e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc20e63df-b9", "ovs_interfaceid": "c20e63df-b9ab-4daf-b7bb-502dff45fae0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 21:54:03 compute-0 nova_compute[192716]: 2025-10-07 21:54:03.286 2 DEBUG oslo_concurrency.lockutils [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Releasing lock "refresh_cache-c0b3d97e-60fb-487c-90d3-2b48392ff09f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 07 21:54:03 compute-0 nova_compute[192716]: 2025-10-07 21:54:03.287 2 DEBUG nova.compute.manager [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Instance network_info: |[{"id": "c20e63df-b9ab-4daf-b7bb-502dff45fae0", "address": "fa:16:3e:95:12:f3", "network": {"id": "f0bd9c95-1d58-40c0-8d62-097453d85d3e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-309070824-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97ef6e0949aa4dd8b3ac7e1495d532e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc20e63df-b9", "ovs_interfaceid": "c20e63df-b9ab-4daf-b7bb-502dff45fae0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Oct 07 21:54:03 compute-0 nova_compute[192716]: 2025-10-07 21:54:03.291 2 DEBUG nova.virt.libvirt.driver [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Start _get_guest_xml network_info=[{"id": "c20e63df-b9ab-4daf-b7bb-502dff45fae0", "address": "fa:16:3e:95:12:f3", "network": {"id": "f0bd9c95-1d58-40c0-8d62-097453d85d3e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-309070824-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97ef6e0949aa4dd8b3ac7e1495d532e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc20e63df-b9", "ovs_interfaceid": "c20e63df-b9ab-4daf-b7bb-502dff45fae0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T21:45:40Z,direct_url=<?>,disk_format='qcow2',id=c40cab67-7e52-4762-b275-de0efa24bdf4,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='293ff4341f3d48a4ae100bf4fc7b99bd',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T21:45:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'boot_index': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'image_id': 'c40cab67-7e52-4762-b275-de0efa24bdf4'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Oct 07 21:54:03 compute-0 nova_compute[192716]: 2025-10-07 21:54:03.297 2 WARNING nova.virt.libvirt.driver [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 21:54:03 compute-0 nova_compute[192716]: 2025-10-07 21:54:03.300 2 DEBUG nova.virt.driver [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='c40cab67-7e52-4762-b275-de0efa24bdf4', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteActionsViaActuator-server-707264346', uuid='c0b3d97e-60fb-487c-90d3-2b48392ff09f'), owner=OwnerMeta(userid='b71b837a81994b9694ede764e0406ac8', username='tempest-TestExecuteActionsViaActuator-1409880739-project-admin', projectid='42e6cb8a77b54158b2345b916b6fd79b', projectname='tempest-TestExecuteActionsViaActuator-1409880739'), image=ImageMeta(id='c40cab67-7e52-4762-b275-de0efa24bdf4', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='e6ccb6f6-2ec4-4305-9fdd-4d931e83ed21', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "c20e63df-b9ab-4daf-b7bb-502dff45fae0", "address": "fa:16:3e:95:12:f3", "network": {"id": "f0bd9c95-1d58-40c0-8d62-097453d85d3e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-309070824-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97ef6e0949aa4dd8b3ac7e1495d532e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc20e63df-b9", "ovs_interfaceid": 
"c20e63df-b9ab-4daf-b7bb-502dff45fae0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251007122402.7278e66.el10', creation_time=1759874043.3003716) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Oct 07 21:54:03 compute-0 nova_compute[192716]: 2025-10-07 21:54:03.307 2 DEBUG nova.virt.libvirt.host [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Oct 07 21:54:03 compute-0 nova_compute[192716]: 2025-10-07 21:54:03.308 2 DEBUG nova.virt.libvirt.host [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Oct 07 21:54:03 compute-0 nova_compute[192716]: 2025-10-07 21:54:03.313 2 DEBUG nova.virt.libvirt.host [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Oct 07 21:54:03 compute-0 nova_compute[192716]: 2025-10-07 21:54:03.313 2 DEBUG nova.virt.libvirt.host [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Oct 07 21:54:03 compute-0 nova_compute[192716]: 2025-10-07 21:54:03.313 2 DEBUG nova.virt.libvirt.driver [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Oct 07 21:54:03 compute-0 nova_compute[192716]: 2025-10-07 21:54:03.313 2 DEBUG nova.virt.hardware [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-07T21:45:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='e6ccb6f6-2ec4-4305-9fdd-4d931e83ed21',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T21:45:40Z,direct_url=<?>,disk_format='qcow2',id=c40cab67-7e52-4762-b275-de0efa24bdf4,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='293ff4341f3d48a4ae100bf4fc7b99bd',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T21:45:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Oct 07 21:54:03 compute-0 nova_compute[192716]: 2025-10-07 21:54:03.314 2 DEBUG nova.virt.hardware [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Oct 07 21:54:03 compute-0 nova_compute[192716]: 2025-10-07 21:54:03.314 2 DEBUG nova.virt.hardware [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Oct 07 21:54:03 compute-0 nova_compute[192716]: 2025-10-07 21:54:03.314 2 DEBUG nova.virt.hardware [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Oct 07 21:54:03 compute-0 nova_compute[192716]: 2025-10-07 21:54:03.314 2 DEBUG nova.virt.hardware [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Oct 07 21:54:03 compute-0 nova_compute[192716]: 2025-10-07 21:54:03.315 2 DEBUG nova.virt.hardware [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Oct 07 21:54:03 compute-0 nova_compute[192716]: 2025-10-07 21:54:03.315 2 DEBUG nova.virt.hardware [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Oct 07 21:54:03 compute-0 nova_compute[192716]: 2025-10-07 21:54:03.315 2 DEBUG nova.virt.hardware [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Oct 07 21:54:03 compute-0 nova_compute[192716]: 2025-10-07 21:54:03.315 2 DEBUG nova.virt.hardware [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Oct 07 21:54:03 compute-0 nova_compute[192716]: 2025-10-07 21:54:03.315 2 DEBUG nova.virt.hardware [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Oct 07 21:54:03 compute-0 nova_compute[192716]: 2025-10-07 21:54:03.316 2 DEBUG nova.virt.hardware [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Oct 07 21:54:03 compute-0 nova_compute[192716]: 2025-10-07 21:54:03.319 2 DEBUG nova.virt.libvirt.vif [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-07T21:53:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-707264346',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-707264346',id=7,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='42e6cb8a77b54158b2345b916b6fd79b',ramdisk_id='',reservation_id='r-0a4go6gs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1409880739',owner_user_name='tempest-TestExecuteActionsViaAct
uator-1409880739-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T21:53:57Z,user_data=None,user_id='b71b837a81994b9694ede764e0406ac8',uuid=c0b3d97e-60fb-487c-90d3-2b48392ff09f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c20e63df-b9ab-4daf-b7bb-502dff45fae0", "address": "fa:16:3e:95:12:f3", "network": {"id": "f0bd9c95-1d58-40c0-8d62-097453d85d3e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-309070824-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97ef6e0949aa4dd8b3ac7e1495d532e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc20e63df-b9", "ovs_interfaceid": "c20e63df-b9ab-4daf-b7bb-502dff45fae0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Oct 07 21:54:03 compute-0 nova_compute[192716]: 2025-10-07 21:54:03.319 2 DEBUG nova.network.os_vif_util [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Converting VIF {"id": "c20e63df-b9ab-4daf-b7bb-502dff45fae0", "address": "fa:16:3e:95:12:f3", "network": {"id": "f0bd9c95-1d58-40c0-8d62-097453d85d3e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-309070824-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97ef6e0949aa4dd8b3ac7e1495d532e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc20e63df-b9", "ovs_interfaceid": "c20e63df-b9ab-4daf-b7bb-502dff45fae0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 21:54:03 compute-0 nova_compute[192716]: 2025-10-07 21:54:03.320 2 DEBUG nova.network.os_vif_util [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:95:12:f3,bridge_name='br-int',has_traffic_filtering=True,id=c20e63df-b9ab-4daf-b7bb-502dff45fae0,network=Network(f0bd9c95-1d58-40c0-8d62-097453d85d3e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc20e63df-b9') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 21:54:03 compute-0 nova_compute[192716]: 2025-10-07 21:54:03.321 2 DEBUG nova.objects.instance [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lazy-loading 'pci_devices' on Instance uuid c0b3d97e-60fb-487c-90d3-2b48392ff09f obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 21:54:03 compute-0 nova_compute[192716]: 2025-10-07 21:54:03.829 2 DEBUG nova.virt.libvirt.driver [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] End _get_guest_xml xml=<domain type="kvm">
Oct 07 21:54:03 compute-0 nova_compute[192716]:   <uuid>c0b3d97e-60fb-487c-90d3-2b48392ff09f</uuid>
Oct 07 21:54:03 compute-0 nova_compute[192716]:   <name>instance-00000007</name>
Oct 07 21:54:03 compute-0 nova_compute[192716]:   <memory>131072</memory>
Oct 07 21:54:03 compute-0 nova_compute[192716]:   <vcpu>1</vcpu>
Oct 07 21:54:03 compute-0 nova_compute[192716]:   <metadata>
Oct 07 21:54:03 compute-0 nova_compute[192716]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 07 21:54:03 compute-0 nova_compute[192716]:       <nova:package version="32.1.0-0.20251007122402.7278e66.el10"/>
Oct 07 21:54:03 compute-0 nova_compute[192716]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-707264346</nova:name>
Oct 07 21:54:03 compute-0 nova_compute[192716]:       <nova:creationTime>2025-10-07 21:54:03</nova:creationTime>
Oct 07 21:54:03 compute-0 nova_compute[192716]:       <nova:flavor name="m1.nano" id="e6ccb6f6-2ec4-4305-9fdd-4d931e83ed21">
Oct 07 21:54:03 compute-0 nova_compute[192716]:         <nova:memory>128</nova:memory>
Oct 07 21:54:03 compute-0 nova_compute[192716]:         <nova:disk>1</nova:disk>
Oct 07 21:54:03 compute-0 nova_compute[192716]:         <nova:swap>0</nova:swap>
Oct 07 21:54:03 compute-0 nova_compute[192716]:         <nova:ephemeral>0</nova:ephemeral>
Oct 07 21:54:03 compute-0 nova_compute[192716]:         <nova:vcpus>1</nova:vcpus>
Oct 07 21:54:03 compute-0 nova_compute[192716]:         <nova:extraSpecs>
Oct 07 21:54:03 compute-0 nova_compute[192716]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 07 21:54:03 compute-0 nova_compute[192716]:         </nova:extraSpecs>
Oct 07 21:54:03 compute-0 nova_compute[192716]:       </nova:flavor>
Oct 07 21:54:03 compute-0 nova_compute[192716]:       <nova:image uuid="c40cab67-7e52-4762-b275-de0efa24bdf4">
Oct 07 21:54:03 compute-0 nova_compute[192716]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 07 21:54:03 compute-0 nova_compute[192716]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 07 21:54:03 compute-0 nova_compute[192716]:         <nova:minDisk>1</nova:minDisk>
Oct 07 21:54:03 compute-0 nova_compute[192716]:         <nova:minRam>0</nova:minRam>
Oct 07 21:54:03 compute-0 nova_compute[192716]:         <nova:properties>
Oct 07 21:54:03 compute-0 nova_compute[192716]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 07 21:54:03 compute-0 nova_compute[192716]:         </nova:properties>
Oct 07 21:54:03 compute-0 nova_compute[192716]:       </nova:image>
Oct 07 21:54:03 compute-0 nova_compute[192716]:       <nova:owner>
Oct 07 21:54:03 compute-0 nova_compute[192716]:         <nova:user uuid="b71b837a81994b9694ede764e0406ac8">tempest-TestExecuteActionsViaActuator-1409880739-project-admin</nova:user>
Oct 07 21:54:03 compute-0 nova_compute[192716]:         <nova:project uuid="42e6cb8a77b54158b2345b916b6fd79b">tempest-TestExecuteActionsViaActuator-1409880739</nova:project>
Oct 07 21:54:03 compute-0 nova_compute[192716]:       </nova:owner>
Oct 07 21:54:03 compute-0 nova_compute[192716]:       <nova:root type="image" uuid="c40cab67-7e52-4762-b275-de0efa24bdf4"/>
Oct 07 21:54:03 compute-0 nova_compute[192716]:       <nova:ports>
Oct 07 21:54:03 compute-0 nova_compute[192716]:         <nova:port uuid="c20e63df-b9ab-4daf-b7bb-502dff45fae0">
Oct 07 21:54:03 compute-0 nova_compute[192716]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Oct 07 21:54:03 compute-0 nova_compute[192716]:         </nova:port>
Oct 07 21:54:03 compute-0 nova_compute[192716]:       </nova:ports>
Oct 07 21:54:03 compute-0 nova_compute[192716]:     </nova:instance>
Oct 07 21:54:03 compute-0 nova_compute[192716]:   </metadata>
Oct 07 21:54:03 compute-0 nova_compute[192716]:   <sysinfo type="smbios">
Oct 07 21:54:03 compute-0 nova_compute[192716]:     <system>
Oct 07 21:54:03 compute-0 nova_compute[192716]:       <entry name="manufacturer">RDO</entry>
Oct 07 21:54:03 compute-0 nova_compute[192716]:       <entry name="product">OpenStack Compute</entry>
Oct 07 21:54:03 compute-0 nova_compute[192716]:       <entry name="version">32.1.0-0.20251007122402.7278e66.el10</entry>
Oct 07 21:54:03 compute-0 nova_compute[192716]:       <entry name="serial">c0b3d97e-60fb-487c-90d3-2b48392ff09f</entry>
Oct 07 21:54:03 compute-0 nova_compute[192716]:       <entry name="uuid">c0b3d97e-60fb-487c-90d3-2b48392ff09f</entry>
Oct 07 21:54:03 compute-0 nova_compute[192716]:       <entry name="family">Virtual Machine</entry>
Oct 07 21:54:03 compute-0 nova_compute[192716]:     </system>
Oct 07 21:54:03 compute-0 nova_compute[192716]:   </sysinfo>
Oct 07 21:54:03 compute-0 nova_compute[192716]:   <os>
Oct 07 21:54:03 compute-0 nova_compute[192716]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 07 21:54:03 compute-0 nova_compute[192716]:     <boot dev="hd"/>
Oct 07 21:54:03 compute-0 nova_compute[192716]:     <smbios mode="sysinfo"/>
Oct 07 21:54:03 compute-0 nova_compute[192716]:   </os>
Oct 07 21:54:03 compute-0 nova_compute[192716]:   <features>
Oct 07 21:54:03 compute-0 nova_compute[192716]:     <acpi/>
Oct 07 21:54:03 compute-0 nova_compute[192716]:     <apic/>
Oct 07 21:54:03 compute-0 nova_compute[192716]:     <vmcoreinfo/>
Oct 07 21:54:03 compute-0 nova_compute[192716]:   </features>
Oct 07 21:54:03 compute-0 nova_compute[192716]:   <clock offset="utc">
Oct 07 21:54:03 compute-0 nova_compute[192716]:     <timer name="pit" tickpolicy="delay"/>
Oct 07 21:54:03 compute-0 nova_compute[192716]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 07 21:54:03 compute-0 nova_compute[192716]:     <timer name="hpet" present="no"/>
Oct 07 21:54:03 compute-0 nova_compute[192716]:   </clock>
Oct 07 21:54:03 compute-0 nova_compute[192716]:   <cpu mode="host-model" match="exact">
Oct 07 21:54:03 compute-0 nova_compute[192716]:     <topology sockets="1" cores="1" threads="1"/>
Oct 07 21:54:03 compute-0 nova_compute[192716]:   </cpu>
Oct 07 21:54:03 compute-0 nova_compute[192716]:   <devices>
Oct 07 21:54:03 compute-0 nova_compute[192716]:     <disk type="file" device="disk">
Oct 07 21:54:03 compute-0 nova_compute[192716]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 07 21:54:03 compute-0 nova_compute[192716]:       <source file="/var/lib/nova/instances/c0b3d97e-60fb-487c-90d3-2b48392ff09f/disk"/>
Oct 07 21:54:03 compute-0 nova_compute[192716]:       <target dev="vda" bus="virtio"/>
Oct 07 21:54:03 compute-0 nova_compute[192716]:     </disk>
Oct 07 21:54:03 compute-0 nova_compute[192716]:     <disk type="file" device="cdrom">
Oct 07 21:54:03 compute-0 nova_compute[192716]:       <driver name="qemu" type="raw" cache="none"/>
Oct 07 21:54:03 compute-0 nova_compute[192716]:       <source file="/var/lib/nova/instances/c0b3d97e-60fb-487c-90d3-2b48392ff09f/disk.config"/>
Oct 07 21:54:03 compute-0 nova_compute[192716]:       <target dev="sda" bus="sata"/>
Oct 07 21:54:03 compute-0 nova_compute[192716]:     </disk>
Oct 07 21:54:03 compute-0 nova_compute[192716]:     <interface type="ethernet">
Oct 07 21:54:03 compute-0 nova_compute[192716]:       <mac address="fa:16:3e:95:12:f3"/>
Oct 07 21:54:03 compute-0 nova_compute[192716]:       <model type="virtio"/>
Oct 07 21:54:03 compute-0 nova_compute[192716]:       <driver name="vhost" rx_queue_size="512"/>
Oct 07 21:54:03 compute-0 nova_compute[192716]:       <mtu size="1442"/>
Oct 07 21:54:03 compute-0 nova_compute[192716]:       <target dev="tapc20e63df-b9"/>
Oct 07 21:54:03 compute-0 nova_compute[192716]:     </interface>
Oct 07 21:54:03 compute-0 nova_compute[192716]:     <serial type="pty">
Oct 07 21:54:03 compute-0 nova_compute[192716]:       <log file="/var/lib/nova/instances/c0b3d97e-60fb-487c-90d3-2b48392ff09f/console.log" append="off"/>
Oct 07 21:54:03 compute-0 nova_compute[192716]:     </serial>
Oct 07 21:54:03 compute-0 nova_compute[192716]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 07 21:54:03 compute-0 nova_compute[192716]:     <video>
Oct 07 21:54:03 compute-0 nova_compute[192716]:       <model type="virtio"/>
Oct 07 21:54:03 compute-0 nova_compute[192716]:     </video>
Oct 07 21:54:03 compute-0 nova_compute[192716]:     <input type="tablet" bus="usb"/>
Oct 07 21:54:03 compute-0 nova_compute[192716]:     <rng model="virtio">
Oct 07 21:54:03 compute-0 nova_compute[192716]:       <backend model="random">/dev/urandom</backend>
Oct 07 21:54:03 compute-0 nova_compute[192716]:     </rng>
Oct 07 21:54:03 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root"/>
Oct 07 21:54:03 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:54:03 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:54:03 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:54:03 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:54:03 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:54:03 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:54:03 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:54:03 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:54:03 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:54:03 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:54:03 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:54:03 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:54:03 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:54:03 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:54:03 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:54:03 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:54:03 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:54:03 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:54:03 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:54:03 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:54:03 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:54:03 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:54:03 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:54:03 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:54:03 compute-0 nova_compute[192716]:     <controller type="usb" index="0"/>
Oct 07 21:54:03 compute-0 nova_compute[192716]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 07 21:54:03 compute-0 nova_compute[192716]:       <stats period="10"/>
Oct 07 21:54:03 compute-0 nova_compute[192716]:     </memballoon>
Oct 07 21:54:03 compute-0 nova_compute[192716]:   </devices>
Oct 07 21:54:03 compute-0 nova_compute[192716]: </domain>
Oct 07 21:54:03 compute-0 nova_compute[192716]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Oct 07 21:54:03 compute-0 nova_compute[192716]: 2025-10-07 21:54:03.830 2 DEBUG nova.compute.manager [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Preparing to wait for external event network-vif-plugged-c20e63df-b9ab-4daf-b7bb-502dff45fae0 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Oct 07 21:54:03 compute-0 nova_compute[192716]: 2025-10-07 21:54:03.831 2 DEBUG oslo_concurrency.lockutils [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Acquiring lock "c0b3d97e-60fb-487c-90d3-2b48392ff09f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:54:03 compute-0 nova_compute[192716]: 2025-10-07 21:54:03.831 2 DEBUG oslo_concurrency.lockutils [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "c0b3d97e-60fb-487c-90d3-2b48392ff09f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:54:03 compute-0 nova_compute[192716]: 2025-10-07 21:54:03.831 2 DEBUG oslo_concurrency.lockutils [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "c0b3d97e-60fb-487c-90d3-2b48392ff09f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:54:03 compute-0 nova_compute[192716]: 2025-10-07 21:54:03.832 2 DEBUG nova.virt.libvirt.vif [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-07T21:53:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-707264346',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-707264346',id=7,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='42e6cb8a77b54158b2345b916b6fd79b',ramdisk_id='',reservation_id='r-0a4go6gs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1409880739',owner_user_name='tempest-TestExecuteActionsViaActuator-1409880739-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T21:53:57Z,user_data=None,user_id='b71b837a81994b9694ede764e0406ac8',uuid=c0b3d97e-60fb-487c-90d3-2b48392ff09f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c20e63df-b9ab-4daf-b7bb-502dff45fae0", "address": "fa:16:3e:95:12:f3", "network": {"id": "f0bd9c95-1d58-40c0-8d62-097453d85d3e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-309070824-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97ef6e0949aa4dd8b3ac7e1495d532e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc20e63df-b9", "ovs_interfaceid": "c20e63df-b9ab-4daf-b7bb-502dff45fae0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 07 21:54:03 compute-0 nova_compute[192716]: 2025-10-07 21:54:03.832 2 DEBUG nova.network.os_vif_util [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Converting VIF {"id": "c20e63df-b9ab-4daf-b7bb-502dff45fae0", "address": "fa:16:3e:95:12:f3", "network": {"id": "f0bd9c95-1d58-40c0-8d62-097453d85d3e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-309070824-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97ef6e0949aa4dd8b3ac7e1495d532e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc20e63df-b9", "ovs_interfaceid": "c20e63df-b9ab-4daf-b7bb-502dff45fae0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 21:54:03 compute-0 nova_compute[192716]: 2025-10-07 21:54:03.833 2 DEBUG nova.network.os_vif_util [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:95:12:f3,bridge_name='br-int',has_traffic_filtering=True,id=c20e63df-b9ab-4daf-b7bb-502dff45fae0,network=Network(f0bd9c95-1d58-40c0-8d62-097453d85d3e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc20e63df-b9') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 21:54:03 compute-0 nova_compute[192716]: 2025-10-07 21:54:03.833 2 DEBUG os_vif [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:95:12:f3,bridge_name='br-int',has_traffic_filtering=True,id=c20e63df-b9ab-4daf-b7bb-502dff45fae0,network=Network(f0bd9c95-1d58-40c0-8d62-097453d85d3e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc20e63df-b9') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 07 21:54:03 compute-0 nova_compute[192716]: 2025-10-07 21:54:03.834 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:54:03 compute-0 nova_compute[192716]: 2025-10-07 21:54:03.834 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:54:03 compute-0 nova_compute[192716]: 2025-10-07 21:54:03.834 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 21:54:03 compute-0 nova_compute[192716]: 2025-10-07 21:54:03.835 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:54:03 compute-0 nova_compute[192716]: 2025-10-07 21:54:03.835 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '305e56f9-217e-5edd-b3d7-88d6908a67d6', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:54:03 compute-0 nova_compute[192716]: 2025-10-07 21:54:03.837 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 07 21:54:03 compute-0 nova_compute[192716]: 2025-10-07 21:54:03.842 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:54:03 compute-0 nova_compute[192716]: 2025-10-07 21:54:03.843 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc20e63df-b9, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:54:03 compute-0 nova_compute[192716]: 2025-10-07 21:54:03.844 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapc20e63df-b9, col_values=(('qos', UUID('04a29024-fefd-4825-aeab-19dd0c6c0706')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:54:03 compute-0 nova_compute[192716]: 2025-10-07 21:54:03.844 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapc20e63df-b9, col_values=(('external_ids', {'iface-id': 'c20e63df-b9ab-4daf-b7bb-502dff45fae0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:95:12:f3', 'vm-uuid': 'c0b3d97e-60fb-487c-90d3-2b48392ff09f'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:54:03 compute-0 nova_compute[192716]: 2025-10-07 21:54:03.846 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:54:03 compute-0 NetworkManager[51722]: <info>  [1759874043.8474] manager: (tapc20e63df-b9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Oct 07 21:54:03 compute-0 nova_compute[192716]: 2025-10-07 21:54:03.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 07 21:54:03 compute-0 nova_compute[192716]: 2025-10-07 21:54:03.854 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:54:03 compute-0 nova_compute[192716]: 2025-10-07 21:54:03.856 2 INFO os_vif [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:95:12:f3,bridge_name='br-int',has_traffic_filtering=True,id=c20e63df-b9ab-4daf-b7bb-502dff45fae0,network=Network(f0bd9c95-1d58-40c0-8d62-097453d85d3e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc20e63df-b9')
Oct 07 21:54:03 compute-0 sshd-session[216950]: Failed password for root from 116.110.151.5 port 51208 ssh2
Oct 07 21:54:04 compute-0 nova_compute[192716]: 2025-10-07 21:54:04.205 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:54:04 compute-0 sshd-session[216950]: Connection closed by authenticating user root 116.110.151.5 port 51208 [preauth]
Oct 07 21:54:04 compute-0 sshd-session[216787]: ssh_dispatch_run_fatal: Connection from 27.79.44.171 port 57400: Connection timed out [preauth]
Oct 07 21:54:05 compute-0 nova_compute[192716]: 2025-10-07 21:54:05.399 2 DEBUG nova.virt.libvirt.driver [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 07 21:54:05 compute-0 nova_compute[192716]: 2025-10-07 21:54:05.400 2 DEBUG nova.virt.libvirt.driver [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 07 21:54:05 compute-0 nova_compute[192716]: 2025-10-07 21:54:05.401 2 DEBUG nova.virt.libvirt.driver [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] No VIF found with MAC fa:16:3e:95:12:f3, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Oct 07 21:54:05 compute-0 nova_compute[192716]: 2025-10-07 21:54:05.401 2 INFO nova.virt.libvirt.driver [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Using config drive
Oct 07 21:54:05 compute-0 nova_compute[192716]: 2025-10-07 21:54:05.912 2 WARNING neutronclient.v2_0.client [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:54:06 compute-0 nova_compute[192716]: 2025-10-07 21:54:06.075 2 INFO nova.virt.libvirt.driver [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Creating config drive at /var/lib/nova/instances/c0b3d97e-60fb-487c-90d3-2b48392ff09f/disk.config
Oct 07 21:54:06 compute-0 nova_compute[192716]: 2025-10-07 21:54:06.080 2 DEBUG oslo_concurrency.processutils [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c0b3d97e-60fb-487c-90d3-2b48392ff09f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251007122402.7278e66.el10 -quiet -J -r -V config-2 /tmp/tmpmb7kpj26 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:54:06 compute-0 nova_compute[192716]: 2025-10-07 21:54:06.211 2 DEBUG oslo_concurrency.processutils [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c0b3d97e-60fb-487c-90d3-2b48392ff09f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251007122402.7278e66.el10 -quiet -J -r -V config-2 /tmp/tmpmb7kpj26" returned: 0 in 0.131s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:54:06 compute-0 kernel: tapc20e63df-b9: entered promiscuous mode
Oct 07 21:54:06 compute-0 NetworkManager[51722]: <info>  [1759874046.2822] manager: (tapc20e63df-b9): new Tun device (/org/freedesktop/NetworkManager/Devices/31)
Oct 07 21:54:06 compute-0 ovn_controller[94904]: 2025-10-07T21:54:06Z|00054|binding|INFO|Claiming lport c20e63df-b9ab-4daf-b7bb-502dff45fae0 for this chassis.
Oct 07 21:54:06 compute-0 ovn_controller[94904]: 2025-10-07T21:54:06Z|00055|binding|INFO|c20e63df-b9ab-4daf-b7bb-502dff45fae0: Claiming fa:16:3e:95:12:f3 10.100.0.10
Oct 07 21:54:06 compute-0 nova_compute[192716]: 2025-10-07 21:54:06.283 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:54:06 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:54:06.293 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:95:12:f3 10.100.0.10'], port_security=['fa:16:3e:95:12:f3 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'c0b3d97e-60fb-487c-90d3-2b48392ff09f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f0bd9c95-1d58-40c0-8d62-097453d85d3e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '42e6cb8a77b54158b2345b916b6fd79b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0b409cfc-ce5d-4372-a7fd-bd2f8e7211c2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=badb36bd-51e1-4b06-9dec-6b9bc7164000, chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], logical_port=c20e63df-b9ab-4daf-b7bb-502dff45fae0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 21:54:06 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:54:06.295 103791 INFO neutron.agent.ovn.metadata.agent [-] Port c20e63df-b9ab-4daf-b7bb-502dff45fae0 in datapath f0bd9c95-1d58-40c0-8d62-097453d85d3e bound to our chassis
Oct 07 21:54:06 compute-0 ovn_controller[94904]: 2025-10-07T21:54:06Z|00056|binding|INFO|Setting lport c20e63df-b9ab-4daf-b7bb-502dff45fae0 ovn-installed in OVS
Oct 07 21:54:06 compute-0 ovn_controller[94904]: 2025-10-07T21:54:06Z|00057|binding|INFO|Setting lport c20e63df-b9ab-4daf-b7bb-502dff45fae0 up in Southbound
Oct 07 21:54:06 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:54:06.297 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f0bd9c95-1d58-40c0-8d62-097453d85d3e
Oct 07 21:54:06 compute-0 nova_compute[192716]: 2025-10-07 21:54:06.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:54:06 compute-0 nova_compute[192716]: 2025-10-07 21:54:06.299 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:54:06 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:54:06.318 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[fa67f8ba-8002-4ebb-937b-1cb2c1dc11e6]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:54:06 compute-0 systemd-udevd[216975]: Network interface NamePolicy= disabled on kernel command line.
Oct 07 21:54:06 compute-0 systemd-machined[152719]: New machine qemu-3-instance-00000007.
Oct 07 21:54:06 compute-0 NetworkManager[51722]: <info>  [1759874046.3528] device (tapc20e63df-b9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 07 21:54:06 compute-0 systemd[1]: Started Virtual Machine qemu-3-instance-00000007.
Oct 07 21:54:06 compute-0 NetworkManager[51722]: <info>  [1759874046.3542] device (tapc20e63df-b9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 07 21:54:06 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:54:06.360 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[ae79d337-0393-4a73-846b-1a836307b17e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:54:06 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:54:06.363 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[099b2044-8ed5-4f2e-bb73-56e86041209f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:54:06 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:54:06.408 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[a17703ec-3bcd-4eef-bf95-357dd934f986]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:54:06 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:54:06.437 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[48032cda-77f9-4906-810d-babe49169576]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf0bd9c95-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:94:9a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 369882, 'reachable_time': 24245, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216986, 'error': None, 'target': 'ovnmeta-f0bd9c95-1d58-40c0-8d62-097453d85d3e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:54:06 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:54:06.461 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[b7a7e969-2bfa-4bb2-b94a-b2bddbd98a24]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf0bd9c95-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 369895, 'tstamp': 369895}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216987, 'error': None, 'target': 'ovnmeta-f0bd9c95-1d58-40c0-8d62-097453d85d3e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf0bd9c95-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 369898, 'tstamp': 369898}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216987, 'error': None, 'target': 'ovnmeta-f0bd9c95-1d58-40c0-8d62-097453d85d3e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:54:06 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:54:06.463 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf0bd9c95-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:54:06 compute-0 nova_compute[192716]: 2025-10-07 21:54:06.466 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:54:06 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:54:06.467 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf0bd9c95-10, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:54:06 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:54:06.467 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 21:54:06 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:54:06.467 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf0bd9c95-10, col_values=(('external_ids', {'iface-id': 'c0a40c81-05dd-4977-aaa2-2a56498aa3a2'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:54:06 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:54:06.468 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 21:54:06 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:54:06.469 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[7a3520de-aadf-4185-82e2-3552b1b658a4]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-f0bd9c95-1d58-40c0-8d62-097453d85d3e\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/f0bd9c95-1d58-40c0-8d62-097453d85d3e.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID f0bd9c95-1d58-40c0-8d62-097453d85d3e\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:54:06 compute-0 nova_compute[192716]: 2025-10-07 21:54:06.916 2 DEBUG nova.compute.manager [req-2ef30993-8dac-488c-a0fc-7da90117f0fc req-292fba59-d8d6-40ea-88c2-11af0776c93b 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Received event network-vif-plugged-c20e63df-b9ab-4daf-b7bb-502dff45fae0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 21:54:06 compute-0 nova_compute[192716]: 2025-10-07 21:54:06.918 2 DEBUG oslo_concurrency.lockutils [req-2ef30993-8dac-488c-a0fc-7da90117f0fc req-292fba59-d8d6-40ea-88c2-11af0776c93b 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "c0b3d97e-60fb-487c-90d3-2b48392ff09f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:54:06 compute-0 nova_compute[192716]: 2025-10-07 21:54:06.918 2 DEBUG oslo_concurrency.lockutils [req-2ef30993-8dac-488c-a0fc-7da90117f0fc req-292fba59-d8d6-40ea-88c2-11af0776c93b 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "c0b3d97e-60fb-487c-90d3-2b48392ff09f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:54:06 compute-0 nova_compute[192716]: 2025-10-07 21:54:06.919 2 DEBUG oslo_concurrency.lockutils [req-2ef30993-8dac-488c-a0fc-7da90117f0fc req-292fba59-d8d6-40ea-88c2-11af0776c93b 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "c0b3d97e-60fb-487c-90d3-2b48392ff09f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:54:06 compute-0 nova_compute[192716]: 2025-10-07 21:54:06.919 2 DEBUG nova.compute.manager [req-2ef30993-8dac-488c-a0fc-7da90117f0fc req-292fba59-d8d6-40ea-88c2-11af0776c93b 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Processing event network-vif-plugged-c20e63df-b9ab-4daf-b7bb-502dff45fae0 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Oct 07 21:54:07 compute-0 nova_compute[192716]: 2025-10-07 21:54:07.188 2 DEBUG nova.compute.manager [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Oct 07 21:54:07 compute-0 nova_compute[192716]: 2025-10-07 21:54:07.192 2 DEBUG nova.virt.libvirt.driver [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Oct 07 21:54:07 compute-0 nova_compute[192716]: 2025-10-07 21:54:07.196 2 INFO nova.virt.libvirt.driver [-] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Instance spawned successfully.
Oct 07 21:54:07 compute-0 nova_compute[192716]: 2025-10-07 21:54:07.197 2 DEBUG nova.virt.libvirt.driver [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Oct 07 21:54:07 compute-0 nova_compute[192716]: 2025-10-07 21:54:07.710 2 DEBUG nova.virt.libvirt.driver [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 21:54:07 compute-0 nova_compute[192716]: 2025-10-07 21:54:07.711 2 DEBUG nova.virt.libvirt.driver [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 21:54:07 compute-0 nova_compute[192716]: 2025-10-07 21:54:07.711 2 DEBUG nova.virt.libvirt.driver [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 21:54:07 compute-0 nova_compute[192716]: 2025-10-07 21:54:07.711 2 DEBUG nova.virt.libvirt.driver [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 21:54:07 compute-0 nova_compute[192716]: 2025-10-07 21:54:07.712 2 DEBUG nova.virt.libvirt.driver [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 21:54:07 compute-0 nova_compute[192716]: 2025-10-07 21:54:07.712 2 DEBUG nova.virt.libvirt.driver [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 21:54:08 compute-0 nova_compute[192716]: 2025-10-07 21:54:08.222 2 INFO nova.compute.manager [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Took 9.69 seconds to spawn the instance on the hypervisor.
Oct 07 21:54:08 compute-0 nova_compute[192716]: 2025-10-07 21:54:08.223 2 DEBUG nova.compute.manager [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 07 21:54:08 compute-0 nova_compute[192716]: 2025-10-07 21:54:08.757 2 INFO nova.compute.manager [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Took 14.98 seconds to build instance.
Oct 07 21:54:08 compute-0 nova_compute[192716]: 2025-10-07 21:54:08.848 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:54:08 compute-0 nova_compute[192716]: 2025-10-07 21:54:08.999 2 DEBUG nova.compute.manager [req-70b0377a-7c70-4335-b088-4c8b1bbe892a req-96f0486a-1ae0-4b40-830b-ed8bcf25983f 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Received event network-vif-plugged-c20e63df-b9ab-4daf-b7bb-502dff45fae0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 21:54:09 compute-0 nova_compute[192716]: 2025-10-07 21:54:09.000 2 DEBUG oslo_concurrency.lockutils [req-70b0377a-7c70-4335-b088-4c8b1bbe892a req-96f0486a-1ae0-4b40-830b-ed8bcf25983f 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "c0b3d97e-60fb-487c-90d3-2b48392ff09f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:54:09 compute-0 nova_compute[192716]: 2025-10-07 21:54:09.000 2 DEBUG oslo_concurrency.lockutils [req-70b0377a-7c70-4335-b088-4c8b1bbe892a req-96f0486a-1ae0-4b40-830b-ed8bcf25983f 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "c0b3d97e-60fb-487c-90d3-2b48392ff09f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:54:09 compute-0 nova_compute[192716]: 2025-10-07 21:54:09.000 2 DEBUG oslo_concurrency.lockutils [req-70b0377a-7c70-4335-b088-4c8b1bbe892a req-96f0486a-1ae0-4b40-830b-ed8bcf25983f 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "c0b3d97e-60fb-487c-90d3-2b48392ff09f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:54:09 compute-0 nova_compute[192716]: 2025-10-07 21:54:09.001 2 DEBUG nova.compute.manager [req-70b0377a-7c70-4335-b088-4c8b1bbe892a req-96f0486a-1ae0-4b40-830b-ed8bcf25983f 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] No waiting events found dispatching network-vif-plugged-c20e63df-b9ab-4daf-b7bb-502dff45fae0 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 21:54:09 compute-0 nova_compute[192716]: 2025-10-07 21:54:09.001 2 WARNING nova.compute.manager [req-70b0377a-7c70-4335-b088-4c8b1bbe892a req-96f0486a-1ae0-4b40-830b-ed8bcf25983f 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Received unexpected event network-vif-plugged-c20e63df-b9ab-4daf-b7bb-502dff45fae0 for instance with vm_state active and task_state None.
Oct 07 21:54:09 compute-0 nova_compute[192716]: 2025-10-07 21:54:09.243 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:54:09 compute-0 nova_compute[192716]: 2025-10-07 21:54:09.263 2 DEBUG oslo_concurrency.lockutils [None req-9f9acb34-9fcb-4909-ad63-ac6ba2e8ca18 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "c0b3d97e-60fb-487c-90d3-2b48392ff09f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.503s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:54:09 compute-0 podman[216996]: 2025-10-07 21:54:09.874920109 +0000 UTC m=+0.098730689 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, container_name=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team)
Oct 07 21:54:09 compute-0 podman[216997]: 2025-10-07 21:54:09.879607072 +0000 UTC m=+0.099520011 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251007, tcib_managed=true)
Oct 07 21:54:13 compute-0 nova_compute[192716]: 2025-10-07 21:54:13.854 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:54:13 compute-0 podman[217035]: 2025-10-07 21:54:13.871349186 +0000 UTC m=+0.092538082 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 07 21:54:14 compute-0 nova_compute[192716]: 2025-10-07 21:54:14.246 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:54:18 compute-0 ovn_controller[94904]: 2025-10-07T21:54:18Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:95:12:f3 10.100.0.10
Oct 07 21:54:18 compute-0 ovn_controller[94904]: 2025-10-07T21:54:18Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:95:12:f3 10.100.0.10
Oct 07 21:54:18 compute-0 nova_compute[192716]: 2025-10-07 21:54:18.858 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:54:19 compute-0 nova_compute[192716]: 2025-10-07 21:54:19.249 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:54:20 compute-0 podman[217076]: 2025-10-07 21:54:20.896342599 +0000 UTC m=+0.125941075 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251007, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 07 21:54:21 compute-0 nova_compute[192716]: 2025-10-07 21:54:21.997 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:54:21 compute-0 nova_compute[192716]: 2025-10-07 21:54:21.997 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:54:21 compute-0 nova_compute[192716]: 2025-10-07 21:54:21.998 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:54:21 compute-0 nova_compute[192716]: 2025-10-07 21:54:21.998 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 07 21:54:22 compute-0 nova_compute[192716]: 2025-10-07 21:54:22.986 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:54:23 compute-0 nova_compute[192716]: 2025-10-07 21:54:23.861 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:54:23 compute-0 podman[217103]: 2025-10-07 21:54:23.871555445 +0000 UTC m=+0.089509446 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251007, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Oct 07 21:54:24 compute-0 nova_compute[192716]: 2025-10-07 21:54:24.253 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:54:24 compute-0 nova_compute[192716]: 2025-10-07 21:54:24.989 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:54:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:54:25.608 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:54:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:54:25.609 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:54:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:54:25.609 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:54:25 compute-0 nova_compute[192716]: 2025-10-07 21:54:25.991 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:54:26 compute-0 nova_compute[192716]: 2025-10-07 21:54:26.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:54:26 compute-0 nova_compute[192716]: 2025-10-07 21:54:26.991 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:54:27 compute-0 nova_compute[192716]: 2025-10-07 21:54:27.506 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:54:27 compute-0 nova_compute[192716]: 2025-10-07 21:54:27.507 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:54:27 compute-0 nova_compute[192716]: 2025-10-07 21:54:27.507 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:54:27 compute-0 nova_compute[192716]: 2025-10-07 21:54:27.507 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 07 21:54:28 compute-0 nova_compute[192716]: 2025-10-07 21:54:28.562 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c0b3d97e-60fb-487c-90d3-2b48392ff09f/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:54:28 compute-0 nova_compute[192716]: 2025-10-07 21:54:28.654 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c0b3d97e-60fb-487c-90d3-2b48392ff09f/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:54:28 compute-0 nova_compute[192716]: 2025-10-07 21:54:28.655 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c0b3d97e-60fb-487c-90d3-2b48392ff09f/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:54:28 compute-0 nova_compute[192716]: 2025-10-07 21:54:28.719 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c0b3d97e-60fb-487c-90d3-2b48392ff09f/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:54:28 compute-0 nova_compute[192716]: 2025-10-07 21:54:28.728 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5138bd92-9a6e-4088-b0b2-bee3a14683ac/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:54:28 compute-0 nova_compute[192716]: 2025-10-07 21:54:28.816 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5138bd92-9a6e-4088-b0b2-bee3a14683ac/disk --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:54:28 compute-0 nova_compute[192716]: 2025-10-07 21:54:28.817 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5138bd92-9a6e-4088-b0b2-bee3a14683ac/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:54:28 compute-0 podman[217131]: 2025-10-07 21:54:28.828719308 +0000 UTC m=+0.069794883 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., release=1755695350, config_id=edpm, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=)
Oct 07 21:54:28 compute-0 nova_compute[192716]: 2025-10-07 21:54:28.882 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5138bd92-9a6e-4088-b0b2-bee3a14683ac/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:54:28 compute-0 nova_compute[192716]: 2025-10-07 21:54:28.914 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:54:29 compute-0 nova_compute[192716]: 2025-10-07 21:54:29.057 2 WARNING nova.virt.libvirt.driver [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 21:54:29 compute-0 nova_compute[192716]: 2025-10-07 21:54:29.058 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:54:29 compute-0 nova_compute[192716]: 2025-10-07 21:54:29.088 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.031s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:54:29 compute-0 nova_compute[192716]: 2025-10-07 21:54:29.089 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5517MB free_disk=73.24943161010742GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 07 21:54:29 compute-0 nova_compute[192716]: 2025-10-07 21:54:29.089 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:54:29 compute-0 nova_compute[192716]: 2025-10-07 21:54:29.089 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:54:29 compute-0 nova_compute[192716]: 2025-10-07 21:54:29.256 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:54:29 compute-0 podman[203153]: time="2025-10-07T21:54:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 21:54:29 compute-0 podman[203153]: @ - - [07/Oct/2025:21:54:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20751 "" "Go-http-client/1.1"
Oct 07 21:54:29 compute-0 podman[203153]: @ - - [07/Oct/2025:21:54:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3460 "" "Go-http-client/1.1"
Oct 07 21:54:30 compute-0 nova_compute[192716]: 2025-10-07 21:54:30.152 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Instance 5138bd92-9a6e-4088-b0b2-bee3a14683ac actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 07 21:54:30 compute-0 nova_compute[192716]: 2025-10-07 21:54:30.153 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Instance c0b3d97e-60fb-487c-90d3-2b48392ff09f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 07 21:54:30 compute-0 nova_compute[192716]: 2025-10-07 21:54:30.153 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 07 21:54:30 compute-0 nova_compute[192716]: 2025-10-07 21:54:30.153 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:54:29 up  1:03,  0 user,  load average: 0.26, 0.21, 0.36\n', 'num_instances': '2', 'num_vm_active': '2', 'num_task_None': '2', 'num_os_type_None': '2', 'num_proj_42e6cb8a77b54158b2345b916b6fd79b': '2', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 07 21:54:30 compute-0 nova_compute[192716]: 2025-10-07 21:54:30.194 2 DEBUG nova.compute.provider_tree [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 21:54:30 compute-0 nova_compute[192716]: 2025-10-07 21:54:30.701 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 21:54:31 compute-0 nova_compute[192716]: 2025-10-07 21:54:31.211 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 07 21:54:31 compute-0 nova_compute[192716]: 2025-10-07 21:54:31.211 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.122s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:54:31 compute-0 openstack_network_exporter[205305]: ERROR   21:54:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 21:54:31 compute-0 openstack_network_exporter[205305]: ERROR   21:54:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:54:31 compute-0 openstack_network_exporter[205305]: ERROR   21:54:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:54:31 compute-0 openstack_network_exporter[205305]: ERROR   21:54:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 21:54:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 21:54:31 compute-0 openstack_network_exporter[205305]: ERROR   21:54:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 21:54:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 21:54:33 compute-0 nova_compute[192716]: 2025-10-07 21:54:33.919 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:54:34 compute-0 nova_compute[192716]: 2025-10-07 21:54:34.256 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:54:35 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:54:35.400 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:e1:cc', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '62:84:ba:a4:1b:31'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 21:54:35 compute-0 nova_compute[192716]: 2025-10-07 21:54:35.400 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:54:35 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:54:35.402 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 07 21:54:38 compute-0 nova_compute[192716]: 2025-10-07 21:54:38.922 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:54:39 compute-0 nova_compute[192716]: 2025-10-07 21:54:39.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:54:40 compute-0 podman[217160]: 2025-10-07 21:54:40.841243962 +0000 UTC m=+0.076686290 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Oct 07 21:54:40 compute-0 podman[217161]: 2025-10-07 21:54:40.866700318 +0000 UTC m=+0.091067020 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct 07 21:54:43 compute-0 nova_compute[192716]: 2025-10-07 21:54:43.926 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:54:44 compute-0 nova_compute[192716]: 2025-10-07 21:54:44.261 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:54:44 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:54:44.404 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=dca786dc-b408-4181-8e47-0e14c60f13da, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:54:44 compute-0 nova_compute[192716]: 2025-10-07 21:54:44.533 2 DEBUG oslo_concurrency.lockutils [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Acquiring lock "6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:54:44 compute-0 nova_compute[192716]: 2025-10-07 21:54:44.534 2 DEBUG oslo_concurrency.lockutils [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:54:44 compute-0 podman[217198]: 2025-10-07 21:54:44.881712795 +0000 UTC m=+0.105385398 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 07 21:54:45 compute-0 nova_compute[192716]: 2025-10-07 21:54:45.040 2 DEBUG nova.compute.manager [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Oct 07 21:54:45 compute-0 nova_compute[192716]: 2025-10-07 21:54:45.590 2 DEBUG oslo_concurrency.lockutils [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:54:45 compute-0 nova_compute[192716]: 2025-10-07 21:54:45.590 2 DEBUG oslo_concurrency.lockutils [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:54:45 compute-0 nova_compute[192716]: 2025-10-07 21:54:45.600 2 DEBUG nova.virt.hardware [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Oct 07 21:54:45 compute-0 nova_compute[192716]: 2025-10-07 21:54:45.601 2 INFO nova.compute.claims [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Claim successful on node compute-0.ctlplane.example.com
Oct 07 21:54:46 compute-0 nova_compute[192716]: 2025-10-07 21:54:46.719 2 DEBUG nova.compute.provider_tree [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 21:54:47 compute-0 nova_compute[192716]: 2025-10-07 21:54:47.227 2 DEBUG nova.scheduler.client.report [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 21:54:47 compute-0 nova_compute[192716]: 2025-10-07 21:54:47.752 2 DEBUG oslo_concurrency.lockutils [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.161s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:54:47 compute-0 nova_compute[192716]: 2025-10-07 21:54:47.752 2 DEBUG nova.compute.manager [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Oct 07 21:54:48 compute-0 nova_compute[192716]: 2025-10-07 21:54:48.266 2 DEBUG nova.compute.manager [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Oct 07 21:54:48 compute-0 nova_compute[192716]: 2025-10-07 21:54:48.266 2 DEBUG nova.network.neutron [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Oct 07 21:54:48 compute-0 nova_compute[192716]: 2025-10-07 21:54:48.267 2 WARNING neutronclient.v2_0.client [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:54:48 compute-0 nova_compute[192716]: 2025-10-07 21:54:48.268 2 WARNING neutronclient.v2_0.client [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:54:48 compute-0 nova_compute[192716]: 2025-10-07 21:54:48.778 2 INFO nova.virt.libvirt.driver [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 07 21:54:48 compute-0 nova_compute[192716]: 2025-10-07 21:54:48.930 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:54:49 compute-0 nova_compute[192716]: 2025-10-07 21:54:49.226 2 DEBUG nova.network.neutron [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Successfully created port: ffef8458-72c0-4d1a-966e-e35470777c1a _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Oct 07 21:54:49 compute-0 nova_compute[192716]: 2025-10-07 21:54:49.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:54:49 compute-0 nova_compute[192716]: 2025-10-07 21:54:49.286 2 DEBUG nova.compute.manager [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Oct 07 21:54:50 compute-0 nova_compute[192716]: 2025-10-07 21:54:50.228 2 DEBUG nova.network.neutron [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Successfully updated port: ffef8458-72c0-4d1a-966e-e35470777c1a _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Oct 07 21:54:50 compute-0 nova_compute[192716]: 2025-10-07 21:54:50.309 2 DEBUG nova.compute.manager [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Oct 07 21:54:50 compute-0 nova_compute[192716]: 2025-10-07 21:54:50.312 2 DEBUG nova.virt.libvirt.driver [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Oct 07 21:54:50 compute-0 nova_compute[192716]: 2025-10-07 21:54:50.312 2 INFO nova.virt.libvirt.driver [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Creating image(s)
Oct 07 21:54:50 compute-0 nova_compute[192716]: 2025-10-07 21:54:50.313 2 DEBUG oslo_concurrency.lockutils [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Acquiring lock "/var/lib/nova/instances/6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:54:50 compute-0 nova_compute[192716]: 2025-10-07 21:54:50.314 2 DEBUG oslo_concurrency.lockutils [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "/var/lib/nova/instances/6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:54:50 compute-0 nova_compute[192716]: 2025-10-07 21:54:50.315 2 DEBUG oslo_concurrency.lockutils [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "/var/lib/nova/instances/6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:54:50 compute-0 nova_compute[192716]: 2025-10-07 21:54:50.316 2 DEBUG oslo_utils.imageutils.format_inspector [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 21:54:50 compute-0 nova_compute[192716]: 2025-10-07 21:54:50.323 2 DEBUG oslo_utils.imageutils.format_inspector [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 21:54:50 compute-0 nova_compute[192716]: 2025-10-07 21:54:50.328 2 DEBUG nova.compute.manager [req-3a24e131-084d-4470-88f8-6ef3c2b0563d req-4219d80a-e791-4aa7-a06e-544b5910dce5 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Received event network-changed-ffef8458-72c0-4d1a-966e-e35470777c1a external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 21:54:50 compute-0 nova_compute[192716]: 2025-10-07 21:54:50.329 2 DEBUG nova.compute.manager [req-3a24e131-084d-4470-88f8-6ef3c2b0563d req-4219d80a-e791-4aa7-a06e-544b5910dce5 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Refreshing instance network info cache due to event network-changed-ffef8458-72c0-4d1a-966e-e35470777c1a. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Oct 07 21:54:50 compute-0 nova_compute[192716]: 2025-10-07 21:54:50.329 2 DEBUG oslo_concurrency.lockutils [req-3a24e131-084d-4470-88f8-6ef3c2b0563d req-4219d80a-e791-4aa7-a06e-544b5910dce5 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "refresh_cache-6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 07 21:54:50 compute-0 nova_compute[192716]: 2025-10-07 21:54:50.329 2 DEBUG oslo_concurrency.lockutils [req-3a24e131-084d-4470-88f8-6ef3c2b0563d req-4219d80a-e791-4aa7-a06e-544b5910dce5 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquired lock "refresh_cache-6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 07 21:54:50 compute-0 nova_compute[192716]: 2025-10-07 21:54:50.330 2 DEBUG nova.network.neutron [req-3a24e131-084d-4470-88f8-6ef3c2b0563d req-4219d80a-e791-4aa7-a06e-544b5910dce5 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Refreshing network info cache for port ffef8458-72c0-4d1a-966e-e35470777c1a _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Oct 07 21:54:50 compute-0 nova_compute[192716]: 2025-10-07 21:54:50.332 2 DEBUG oslo_concurrency.processutils [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:54:50 compute-0 nova_compute[192716]: 2025-10-07 21:54:50.425 2 DEBUG oslo_concurrency.processutils [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:54:50 compute-0 nova_compute[192716]: 2025-10-07 21:54:50.427 2 DEBUG oslo_concurrency.lockutils [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Acquiring lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:54:50 compute-0 nova_compute[192716]: 2025-10-07 21:54:50.428 2 DEBUG oslo_concurrency.lockutils [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:54:50 compute-0 nova_compute[192716]: 2025-10-07 21:54:50.429 2 DEBUG oslo_utils.imageutils.format_inspector [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 21:54:50 compute-0 nova_compute[192716]: 2025-10-07 21:54:50.436 2 DEBUG oslo_utils.imageutils.format_inspector [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 21:54:50 compute-0 nova_compute[192716]: 2025-10-07 21:54:50.436 2 DEBUG oslo_concurrency.processutils [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:54:50 compute-0 nova_compute[192716]: 2025-10-07 21:54:50.533 2 DEBUG oslo_concurrency.processutils [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:54:50 compute-0 nova_compute[192716]: 2025-10-07 21:54:50.535 2 DEBUG oslo_concurrency.processutils [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71,backing_fmt=raw /var/lib/nova/instances/6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:54:50 compute-0 nova_compute[192716]: 2025-10-07 21:54:50.572 2 DEBUG oslo_concurrency.processutils [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71,backing_fmt=raw /var/lib/nova/instances/6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:54:50 compute-0 nova_compute[192716]: 2025-10-07 21:54:50.573 2 DEBUG oslo_concurrency.lockutils [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.145s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:54:50 compute-0 nova_compute[192716]: 2025-10-07 21:54:50.573 2 DEBUG oslo_concurrency.processutils [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:54:50 compute-0 nova_compute[192716]: 2025-10-07 21:54:50.633 2 DEBUG oslo_concurrency.processutils [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:54:50 compute-0 nova_compute[192716]: 2025-10-07 21:54:50.634 2 DEBUG nova.virt.disk.api [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Checking if we can resize image /var/lib/nova/instances/6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 07 21:54:50 compute-0 nova_compute[192716]: 2025-10-07 21:54:50.635 2 DEBUG oslo_concurrency.processutils [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:54:50 compute-0 nova_compute[192716]: 2025-10-07 21:54:50.687 2 DEBUG oslo_concurrency.processutils [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:54:50 compute-0 nova_compute[192716]: 2025-10-07 21:54:50.688 2 DEBUG nova.virt.disk.api [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Cannot resize image /var/lib/nova/instances/6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 07 21:54:50 compute-0 nova_compute[192716]: 2025-10-07 21:54:50.689 2 DEBUG nova.virt.libvirt.driver [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Oct 07 21:54:50 compute-0 nova_compute[192716]: 2025-10-07 21:54:50.689 2 DEBUG nova.virt.libvirt.driver [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Ensure instance console log exists: /var/lib/nova/instances/6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Oct 07 21:54:50 compute-0 nova_compute[192716]: 2025-10-07 21:54:50.690 2 DEBUG oslo_concurrency.lockutils [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:54:50 compute-0 nova_compute[192716]: 2025-10-07 21:54:50.690 2 DEBUG oslo_concurrency.lockutils [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:54:50 compute-0 nova_compute[192716]: 2025-10-07 21:54:50.690 2 DEBUG oslo_concurrency.lockutils [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:54:50 compute-0 nova_compute[192716]: 2025-10-07 21:54:50.736 2 DEBUG oslo_concurrency.lockutils [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Acquiring lock "refresh_cache-6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 07 21:54:50 compute-0 nova_compute[192716]: 2025-10-07 21:54:50.848 2 WARNING neutronclient.v2_0.client [req-3a24e131-084d-4470-88f8-6ef3c2b0563d req-4219d80a-e791-4aa7-a06e-544b5910dce5 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:54:51 compute-0 nova_compute[192716]: 2025-10-07 21:54:51.783 2 DEBUG nova.network.neutron [req-3a24e131-084d-4470-88f8-6ef3c2b0563d req-4219d80a-e791-4aa7-a06e-544b5910dce5 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 07 21:54:51 compute-0 podman[217239]: 2025-10-07 21:54:51.899074611 +0000 UTC m=+0.128994272 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 07 21:54:52 compute-0 nova_compute[192716]: 2025-10-07 21:54:52.750 2 DEBUG nova.network.neutron [req-3a24e131-084d-4470-88f8-6ef3c2b0563d req-4219d80a-e791-4aa7-a06e-544b5910dce5 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 21:54:53 compute-0 nova_compute[192716]: 2025-10-07 21:54:53.257 2 DEBUG oslo_concurrency.lockutils [req-3a24e131-084d-4470-88f8-6ef3c2b0563d req-4219d80a-e791-4aa7-a06e-544b5910dce5 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Releasing lock "refresh_cache-6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 07 21:54:53 compute-0 nova_compute[192716]: 2025-10-07 21:54:53.258 2 DEBUG oslo_concurrency.lockutils [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Acquired lock "refresh_cache-6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 07 21:54:53 compute-0 nova_compute[192716]: 2025-10-07 21:54:53.258 2 DEBUG nova.network.neutron [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 07 21:54:53 compute-0 nova_compute[192716]: 2025-10-07 21:54:53.881 2 DEBUG nova.network.neutron [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 07 21:54:53 compute-0 nova_compute[192716]: 2025-10-07 21:54:53.933 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:54:54 compute-0 nova_compute[192716]: 2025-10-07 21:54:54.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:54:54 compute-0 nova_compute[192716]: 2025-10-07 21:54:54.651 2 WARNING neutronclient.v2_0.client [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:54:54 compute-0 podman[217267]: 2025-10-07 21:54:54.90377223 +0000 UTC m=+0.134080128 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251007)
Oct 07 21:54:54 compute-0 nova_compute[192716]: 2025-10-07 21:54:54.906 2 DEBUG nova.network.neutron [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Updating instance_info_cache with network_info: [{"id": "ffef8458-72c0-4d1a-966e-e35470777c1a", "address": "fa:16:3e:71:3a:79", "network": {"id": "f0bd9c95-1d58-40c0-8d62-097453d85d3e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-309070824-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97ef6e0949aa4dd8b3ac7e1495d532e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffef8458-72", "ovs_interfaceid": "ffef8458-72c0-4d1a-966e-e35470777c1a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 21:54:55 compute-0 nova_compute[192716]: 2025-10-07 21:54:55.411 2 DEBUG oslo_concurrency.lockutils [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Releasing lock "refresh_cache-6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 07 21:54:55 compute-0 nova_compute[192716]: 2025-10-07 21:54:55.412 2 DEBUG nova.compute.manager [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Instance network_info: |[{"id": "ffef8458-72c0-4d1a-966e-e35470777c1a", "address": "fa:16:3e:71:3a:79", "network": {"id": "f0bd9c95-1d58-40c0-8d62-097453d85d3e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-309070824-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97ef6e0949aa4dd8b3ac7e1495d532e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffef8458-72", "ovs_interfaceid": "ffef8458-72c0-4d1a-966e-e35470777c1a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Oct 07 21:54:55 compute-0 nova_compute[192716]: 2025-10-07 21:54:55.417 2 DEBUG nova.virt.libvirt.driver [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Start _get_guest_xml network_info=[{"id": "ffef8458-72c0-4d1a-966e-e35470777c1a", "address": "fa:16:3e:71:3a:79", "network": {"id": "f0bd9c95-1d58-40c0-8d62-097453d85d3e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-309070824-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97ef6e0949aa4dd8b3ac7e1495d532e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffef8458-72", "ovs_interfaceid": "ffef8458-72c0-4d1a-966e-e35470777c1a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T21:45:40Z,direct_url=<?>,disk_format='qcow2',id=c40cab67-7e52-4762-b275-de0efa24bdf4,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='293ff4341f3d48a4ae100bf4fc7b99bd',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T21:45:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'boot_index': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'image_id': 'c40cab67-7e52-4762-b275-de0efa24bdf4'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Oct 07 21:54:55 compute-0 nova_compute[192716]: 2025-10-07 21:54:55.423 2 WARNING nova.virt.libvirt.driver [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 21:54:55 compute-0 nova_compute[192716]: 2025-10-07 21:54:55.425 2 DEBUG nova.virt.driver [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='c40cab67-7e52-4762-b275-de0efa24bdf4', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteActionsViaActuator-server-1048421019', uuid='6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8'), owner=OwnerMeta(userid='b71b837a81994b9694ede764e0406ac8', username='tempest-TestExecuteActionsViaActuator-1409880739-project-admin', projectid='42e6cb8a77b54158b2345b916b6fd79b', projectname='tempest-TestExecuteActionsViaActuator-1409880739'), image=ImageMeta(id='c40cab67-7e52-4762-b275-de0efa24bdf4', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='e6ccb6f6-2ec4-4305-9fdd-4d931e83ed21', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "ffef8458-72c0-4d1a-966e-e35470777c1a", "address": "fa:16:3e:71:3a:79", "network": {"id": "f0bd9c95-1d58-40c0-8d62-097453d85d3e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-309070824-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97ef6e0949aa4dd8b3ac7e1495d532e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffef8458-72", "ovs_interfaceid": "ffef8458-72c0-4d1a-966e-e35470777c1a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251007122402.7278e66.el10', creation_time=1759874095.4257083) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Oct 07 21:54:55 compute-0 nova_compute[192716]: 2025-10-07 21:54:55.430 2 DEBUG nova.virt.libvirt.host [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Oct 07 21:54:55 compute-0 nova_compute[192716]: 2025-10-07 21:54:55.431 2 DEBUG nova.virt.libvirt.host [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Oct 07 21:54:55 compute-0 nova_compute[192716]: 2025-10-07 21:54:55.433 2 DEBUG nova.virt.libvirt.host [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Oct 07 21:54:55 compute-0 nova_compute[192716]: 2025-10-07 21:54:55.434 2 DEBUG nova.virt.libvirt.host [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Oct 07 21:54:55 compute-0 nova_compute[192716]: 2025-10-07 21:54:55.435 2 DEBUG nova.virt.libvirt.driver [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Oct 07 21:54:55 compute-0 nova_compute[192716]: 2025-10-07 21:54:55.435 2 DEBUG nova.virt.hardware [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-07T21:45:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='e6ccb6f6-2ec4-4305-9fdd-4d931e83ed21',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T21:45:40Z,direct_url=<?>,disk_format='qcow2',id=c40cab67-7e52-4762-b275-de0efa24bdf4,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='293ff4341f3d48a4ae100bf4fc7b99bd',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T21:45:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Oct 07 21:54:55 compute-0 nova_compute[192716]: 2025-10-07 21:54:55.435 2 DEBUG nova.virt.hardware [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Oct 07 21:54:55 compute-0 nova_compute[192716]: 2025-10-07 21:54:55.436 2 DEBUG nova.virt.hardware [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Oct 07 21:54:55 compute-0 nova_compute[192716]: 2025-10-07 21:54:55.436 2 DEBUG nova.virt.hardware [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Oct 07 21:54:55 compute-0 nova_compute[192716]: 2025-10-07 21:54:55.436 2 DEBUG nova.virt.hardware [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Oct 07 21:54:55 compute-0 nova_compute[192716]: 2025-10-07 21:54:55.437 2 DEBUG nova.virt.hardware [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Oct 07 21:54:55 compute-0 nova_compute[192716]: 2025-10-07 21:54:55.437 2 DEBUG nova.virt.hardware [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Oct 07 21:54:55 compute-0 nova_compute[192716]: 2025-10-07 21:54:55.437 2 DEBUG nova.virt.hardware [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Oct 07 21:54:55 compute-0 nova_compute[192716]: 2025-10-07 21:54:55.438 2 DEBUG nova.virt.hardware [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Oct 07 21:54:55 compute-0 nova_compute[192716]: 2025-10-07 21:54:55.438 2 DEBUG nova.virt.hardware [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Oct 07 21:54:55 compute-0 nova_compute[192716]: 2025-10-07 21:54:55.438 2 DEBUG nova.virt.hardware [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Oct 07 21:54:55 compute-0 nova_compute[192716]: 2025-10-07 21:54:55.443 2 DEBUG nova.virt.libvirt.vif [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-07T21:54:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1048421019',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1048421019',id=9,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='42e6cb8a77b54158b2345b916b6fd79b',ramdisk_id='',reservation_id='r-w2f000da',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1409880739',owner_user_name='tempest-TestExecuteActionsViaActuator-1409880739-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T21:54:49Z,user_data=None,user_id='b71b837a81994b9694ede764e0406ac8',uuid=6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ffef8458-72c0-4d1a-966e-e35470777c1a", "address": "fa:16:3e:71:3a:79", "network": {"id": "f0bd9c95-1d58-40c0-8d62-097453d85d3e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-309070824-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97ef6e0949aa4dd8b3ac7e1495d532e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffef8458-72", "ovs_interfaceid": "ffef8458-72c0-4d1a-966e-e35470777c1a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Oct 07 21:54:55 compute-0 nova_compute[192716]: 2025-10-07 21:54:55.443 2 DEBUG nova.network.os_vif_util [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Converting VIF {"id": "ffef8458-72c0-4d1a-966e-e35470777c1a", "address": "fa:16:3e:71:3a:79", "network": {"id": "f0bd9c95-1d58-40c0-8d62-097453d85d3e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-309070824-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97ef6e0949aa4dd8b3ac7e1495d532e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffef8458-72", "ovs_interfaceid": "ffef8458-72c0-4d1a-966e-e35470777c1a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 21:54:55 compute-0 nova_compute[192716]: 2025-10-07 21:54:55.444 2 DEBUG nova.network.os_vif_util [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:71:3a:79,bridge_name='br-int',has_traffic_filtering=True,id=ffef8458-72c0-4d1a-966e-e35470777c1a,network=Network(f0bd9c95-1d58-40c0-8d62-097453d85d3e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapffef8458-72') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 21:54:55 compute-0 nova_compute[192716]: 2025-10-07 21:54:55.445 2 DEBUG nova.objects.instance [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lazy-loading 'pci_devices' on Instance uuid 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 21:54:55 compute-0 nova_compute[192716]: 2025-10-07 21:54:55.953 2 DEBUG nova.virt.libvirt.driver [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] End _get_guest_xml xml=<domain type="kvm">
Oct 07 21:54:55 compute-0 nova_compute[192716]:   <uuid>6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8</uuid>
Oct 07 21:54:55 compute-0 nova_compute[192716]:   <name>instance-00000009</name>
Oct 07 21:54:55 compute-0 nova_compute[192716]:   <memory>131072</memory>
Oct 07 21:54:55 compute-0 nova_compute[192716]:   <vcpu>1</vcpu>
Oct 07 21:54:55 compute-0 nova_compute[192716]:   <metadata>
Oct 07 21:54:55 compute-0 nova_compute[192716]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 07 21:54:55 compute-0 nova_compute[192716]:       <nova:package version="32.1.0-0.20251007122402.7278e66.el10"/>
Oct 07 21:54:55 compute-0 nova_compute[192716]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-1048421019</nova:name>
Oct 07 21:54:55 compute-0 nova_compute[192716]:       <nova:creationTime>2025-10-07 21:54:55</nova:creationTime>
Oct 07 21:54:55 compute-0 nova_compute[192716]:       <nova:flavor name="m1.nano" id="e6ccb6f6-2ec4-4305-9fdd-4d931e83ed21">
Oct 07 21:54:55 compute-0 nova_compute[192716]:         <nova:memory>128</nova:memory>
Oct 07 21:54:55 compute-0 nova_compute[192716]:         <nova:disk>1</nova:disk>
Oct 07 21:54:55 compute-0 nova_compute[192716]:         <nova:swap>0</nova:swap>
Oct 07 21:54:55 compute-0 nova_compute[192716]:         <nova:ephemeral>0</nova:ephemeral>
Oct 07 21:54:55 compute-0 nova_compute[192716]:         <nova:vcpus>1</nova:vcpus>
Oct 07 21:54:55 compute-0 nova_compute[192716]:         <nova:extraSpecs>
Oct 07 21:54:55 compute-0 nova_compute[192716]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 07 21:54:55 compute-0 nova_compute[192716]:         </nova:extraSpecs>
Oct 07 21:54:55 compute-0 nova_compute[192716]:       </nova:flavor>
Oct 07 21:54:55 compute-0 nova_compute[192716]:       <nova:image uuid="c40cab67-7e52-4762-b275-de0efa24bdf4">
Oct 07 21:54:55 compute-0 nova_compute[192716]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 07 21:54:55 compute-0 nova_compute[192716]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 07 21:54:55 compute-0 nova_compute[192716]:         <nova:minDisk>1</nova:minDisk>
Oct 07 21:54:55 compute-0 nova_compute[192716]:         <nova:minRam>0</nova:minRam>
Oct 07 21:54:55 compute-0 nova_compute[192716]:         <nova:properties>
Oct 07 21:54:55 compute-0 nova_compute[192716]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 07 21:54:55 compute-0 nova_compute[192716]:         </nova:properties>
Oct 07 21:54:55 compute-0 nova_compute[192716]:       </nova:image>
Oct 07 21:54:55 compute-0 nova_compute[192716]:       <nova:owner>
Oct 07 21:54:55 compute-0 nova_compute[192716]:         <nova:user uuid="b71b837a81994b9694ede764e0406ac8">tempest-TestExecuteActionsViaActuator-1409880739-project-admin</nova:user>
Oct 07 21:54:55 compute-0 nova_compute[192716]:         <nova:project uuid="42e6cb8a77b54158b2345b916b6fd79b">tempest-TestExecuteActionsViaActuator-1409880739</nova:project>
Oct 07 21:54:55 compute-0 nova_compute[192716]:       </nova:owner>
Oct 07 21:54:55 compute-0 nova_compute[192716]:       <nova:root type="image" uuid="c40cab67-7e52-4762-b275-de0efa24bdf4"/>
Oct 07 21:54:55 compute-0 nova_compute[192716]:       <nova:ports>
Oct 07 21:54:55 compute-0 nova_compute[192716]:         <nova:port uuid="ffef8458-72c0-4d1a-966e-e35470777c1a">
Oct 07 21:54:55 compute-0 nova_compute[192716]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Oct 07 21:54:55 compute-0 nova_compute[192716]:         </nova:port>
Oct 07 21:54:55 compute-0 nova_compute[192716]:       </nova:ports>
Oct 07 21:54:55 compute-0 nova_compute[192716]:     </nova:instance>
Oct 07 21:54:55 compute-0 nova_compute[192716]:   </metadata>
Oct 07 21:54:55 compute-0 nova_compute[192716]:   <sysinfo type="smbios">
Oct 07 21:54:55 compute-0 nova_compute[192716]:     <system>
Oct 07 21:54:55 compute-0 nova_compute[192716]:       <entry name="manufacturer">RDO</entry>
Oct 07 21:54:55 compute-0 nova_compute[192716]:       <entry name="product">OpenStack Compute</entry>
Oct 07 21:54:55 compute-0 nova_compute[192716]:       <entry name="version">32.1.0-0.20251007122402.7278e66.el10</entry>
Oct 07 21:54:55 compute-0 nova_compute[192716]:       <entry name="serial">6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8</entry>
Oct 07 21:54:55 compute-0 nova_compute[192716]:       <entry name="uuid">6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8</entry>
Oct 07 21:54:55 compute-0 nova_compute[192716]:       <entry name="family">Virtual Machine</entry>
Oct 07 21:54:55 compute-0 nova_compute[192716]:     </system>
Oct 07 21:54:55 compute-0 nova_compute[192716]:   </sysinfo>
Oct 07 21:54:55 compute-0 nova_compute[192716]:   <os>
Oct 07 21:54:55 compute-0 nova_compute[192716]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 07 21:54:55 compute-0 nova_compute[192716]:     <boot dev="hd"/>
Oct 07 21:54:55 compute-0 nova_compute[192716]:     <smbios mode="sysinfo"/>
Oct 07 21:54:55 compute-0 nova_compute[192716]:   </os>
Oct 07 21:54:55 compute-0 nova_compute[192716]:   <features>
Oct 07 21:54:55 compute-0 nova_compute[192716]:     <acpi/>
Oct 07 21:54:55 compute-0 nova_compute[192716]:     <apic/>
Oct 07 21:54:55 compute-0 nova_compute[192716]:     <vmcoreinfo/>
Oct 07 21:54:55 compute-0 nova_compute[192716]:   </features>
Oct 07 21:54:55 compute-0 nova_compute[192716]:   <clock offset="utc">
Oct 07 21:54:55 compute-0 nova_compute[192716]:     <timer name="pit" tickpolicy="delay"/>
Oct 07 21:54:55 compute-0 nova_compute[192716]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 07 21:54:55 compute-0 nova_compute[192716]:     <timer name="hpet" present="no"/>
Oct 07 21:54:55 compute-0 nova_compute[192716]:   </clock>
Oct 07 21:54:55 compute-0 nova_compute[192716]:   <cpu mode="host-model" match="exact">
Oct 07 21:54:55 compute-0 nova_compute[192716]:     <topology sockets="1" cores="1" threads="1"/>
Oct 07 21:54:55 compute-0 nova_compute[192716]:   </cpu>
Oct 07 21:54:55 compute-0 nova_compute[192716]:   <devices>
Oct 07 21:54:55 compute-0 nova_compute[192716]:     <disk type="file" device="disk">
Oct 07 21:54:55 compute-0 nova_compute[192716]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 07 21:54:55 compute-0 nova_compute[192716]:       <source file="/var/lib/nova/instances/6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8/disk"/>
Oct 07 21:54:55 compute-0 nova_compute[192716]:       <target dev="vda" bus="virtio"/>
Oct 07 21:54:55 compute-0 nova_compute[192716]:     </disk>
Oct 07 21:54:55 compute-0 nova_compute[192716]:     <disk type="file" device="cdrom">
Oct 07 21:54:55 compute-0 nova_compute[192716]:       <driver name="qemu" type="raw" cache="none"/>
Oct 07 21:54:55 compute-0 nova_compute[192716]:       <source file="/var/lib/nova/instances/6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8/disk.config"/>
Oct 07 21:54:55 compute-0 nova_compute[192716]:       <target dev="sda" bus="sata"/>
Oct 07 21:54:55 compute-0 nova_compute[192716]:     </disk>
Oct 07 21:54:55 compute-0 nova_compute[192716]:     <interface type="ethernet">
Oct 07 21:54:55 compute-0 nova_compute[192716]:       <mac address="fa:16:3e:71:3a:79"/>
Oct 07 21:54:55 compute-0 nova_compute[192716]:       <model type="virtio"/>
Oct 07 21:54:55 compute-0 nova_compute[192716]:       <driver name="vhost" rx_queue_size="512"/>
Oct 07 21:54:55 compute-0 nova_compute[192716]:       <mtu size="1442"/>
Oct 07 21:54:55 compute-0 nova_compute[192716]:       <target dev="tapffef8458-72"/>
Oct 07 21:54:55 compute-0 nova_compute[192716]:     </interface>
Oct 07 21:54:55 compute-0 nova_compute[192716]:     <serial type="pty">
Oct 07 21:54:55 compute-0 nova_compute[192716]:       <log file="/var/lib/nova/instances/6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8/console.log" append="off"/>
Oct 07 21:54:55 compute-0 nova_compute[192716]:     </serial>
Oct 07 21:54:55 compute-0 nova_compute[192716]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 07 21:54:55 compute-0 nova_compute[192716]:     <video>
Oct 07 21:54:55 compute-0 nova_compute[192716]:       <model type="virtio"/>
Oct 07 21:54:55 compute-0 nova_compute[192716]:     </video>
Oct 07 21:54:55 compute-0 nova_compute[192716]:     <input type="tablet" bus="usb"/>
Oct 07 21:54:55 compute-0 nova_compute[192716]:     <rng model="virtio">
Oct 07 21:54:55 compute-0 nova_compute[192716]:       <backend model="random">/dev/urandom</backend>
Oct 07 21:54:55 compute-0 nova_compute[192716]:     </rng>
Oct 07 21:54:55 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root"/>
Oct 07 21:54:55 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:54:55 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:54:55 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:54:55 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:54:55 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:54:55 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:54:55 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:54:55 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:54:55 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:54:55 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:54:55 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:54:55 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:54:55 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:54:55 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:54:55 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:54:55 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:54:55 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:54:55 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:54:55 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:54:55 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:54:55 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:54:55 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:54:55 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:54:55 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:54:55 compute-0 nova_compute[192716]:     <controller type="usb" index="0"/>
Oct 07 21:54:55 compute-0 nova_compute[192716]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 07 21:54:55 compute-0 nova_compute[192716]:       <stats period="10"/>
Oct 07 21:54:55 compute-0 nova_compute[192716]:     </memballoon>
Oct 07 21:54:55 compute-0 nova_compute[192716]:   </devices>
Oct 07 21:54:55 compute-0 nova_compute[192716]: </domain>
Oct 07 21:54:55 compute-0 nova_compute[192716]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Oct 07 21:54:55 compute-0 nova_compute[192716]: 2025-10-07 21:54:55.955 2 DEBUG nova.compute.manager [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Preparing to wait for external event network-vif-plugged-ffef8458-72c0-4d1a-966e-e35470777c1a prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Oct 07 21:54:55 compute-0 nova_compute[192716]: 2025-10-07 21:54:55.955 2 DEBUG oslo_concurrency.lockutils [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Acquiring lock "6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:54:55 compute-0 nova_compute[192716]: 2025-10-07 21:54:55.956 2 DEBUG oslo_concurrency.lockutils [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:54:55 compute-0 nova_compute[192716]: 2025-10-07 21:54:55.956 2 DEBUG oslo_concurrency.lockutils [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:54:55 compute-0 nova_compute[192716]: 2025-10-07 21:54:55.956 2 DEBUG nova.virt.libvirt.vif [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-07T21:54:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1048421019',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1048421019',id=9,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='42e6cb8a77b54158b2345b916b6fd79b',ramdisk_id='',reservation_id='r-w2f000da',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1409880739',owner_user_name='tempest-TestExecuteActionsViaActuator-1409880739-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T21:54:49Z,user_data=None,user_id='b71b837a81994b9694ede764e0406ac8',uuid=6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ffef8458-72c0-4d1a-966e-e35470777c1a", "address": "fa:16:3e:71:3a:79", "network": {"id": "f0bd9c95-1d58-40c0-8d62-097453d85d3e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-309070824-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97ef6e0949aa4dd8b3ac7e1495d532e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffef8458-72", "ovs_interfaceid": "ffef8458-72c0-4d1a-966e-e35470777c1a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 07 21:54:55 compute-0 nova_compute[192716]: 2025-10-07 21:54:55.957 2 DEBUG nova.network.os_vif_util [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Converting VIF {"id": "ffef8458-72c0-4d1a-966e-e35470777c1a", "address": "fa:16:3e:71:3a:79", "network": {"id": "f0bd9c95-1d58-40c0-8d62-097453d85d3e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-309070824-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97ef6e0949aa4dd8b3ac7e1495d532e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffef8458-72", "ovs_interfaceid": "ffef8458-72c0-4d1a-966e-e35470777c1a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 21:54:55 compute-0 nova_compute[192716]: 2025-10-07 21:54:55.957 2 DEBUG nova.network.os_vif_util [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:71:3a:79,bridge_name='br-int',has_traffic_filtering=True,id=ffef8458-72c0-4d1a-966e-e35470777c1a,network=Network(f0bd9c95-1d58-40c0-8d62-097453d85d3e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapffef8458-72') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 21:54:55 compute-0 nova_compute[192716]: 2025-10-07 21:54:55.957 2 DEBUG os_vif [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:71:3a:79,bridge_name='br-int',has_traffic_filtering=True,id=ffef8458-72c0-4d1a-966e-e35470777c1a,network=Network(f0bd9c95-1d58-40c0-8d62-097453d85d3e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapffef8458-72') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 07 21:54:55 compute-0 nova_compute[192716]: 2025-10-07 21:54:55.958 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:54:55 compute-0 nova_compute[192716]: 2025-10-07 21:54:55.958 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:54:55 compute-0 nova_compute[192716]: 2025-10-07 21:54:55.958 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 21:54:55 compute-0 nova_compute[192716]: 2025-10-07 21:54:55.959 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:54:55 compute-0 nova_compute[192716]: 2025-10-07 21:54:55.959 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '4b2630a5-cf34-5dcc-9f79-1e4166663186', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:54:55 compute-0 nova_compute[192716]: 2025-10-07 21:54:55.982 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:54:55 compute-0 nova_compute[192716]: 2025-10-07 21:54:55.984 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:54:55 compute-0 nova_compute[192716]: 2025-10-07 21:54:55.987 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:54:55 compute-0 nova_compute[192716]: 2025-10-07 21:54:55.987 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapffef8458-72, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:54:55 compute-0 nova_compute[192716]: 2025-10-07 21:54:55.987 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapffef8458-72, col_values=(('qos', UUID('5ca33056-da1a-4715-b9e0-0c37c4506140')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:54:55 compute-0 nova_compute[192716]: 2025-10-07 21:54:55.988 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapffef8458-72, col_values=(('external_ids', {'iface-id': 'ffef8458-72c0-4d1a-966e-e35470777c1a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:71:3a:79', 'vm-uuid': '6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:54:55 compute-0 nova_compute[192716]: 2025-10-07 21:54:55.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:54:55 compute-0 NetworkManager[51722]: <info>  [1759874095.9897] manager: (tapffef8458-72): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/32)
Oct 07 21:54:55 compute-0 nova_compute[192716]: 2025-10-07 21:54:55.990 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 07 21:54:55 compute-0 nova_compute[192716]: 2025-10-07 21:54:55.994 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:54:55 compute-0 nova_compute[192716]: 2025-10-07 21:54:55.994 2 INFO os_vif [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:71:3a:79,bridge_name='br-int',has_traffic_filtering=True,id=ffef8458-72c0-4d1a-966e-e35470777c1a,network=Network(f0bd9c95-1d58-40c0-8d62-097453d85d3e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapffef8458-72')
Oct 07 21:54:57 compute-0 nova_compute[192716]: 2025-10-07 21:54:57.533 2 DEBUG nova.virt.libvirt.driver [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 07 21:54:57 compute-0 nova_compute[192716]: 2025-10-07 21:54:57.534 2 DEBUG nova.virt.libvirt.driver [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 07 21:54:57 compute-0 nova_compute[192716]: 2025-10-07 21:54:57.535 2 DEBUG nova.virt.libvirt.driver [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] No VIF found with MAC fa:16:3e:71:3a:79, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Oct 07 21:54:57 compute-0 nova_compute[192716]: 2025-10-07 21:54:57.535 2 INFO nova.virt.libvirt.driver [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Using config drive
Oct 07 21:54:58 compute-0 nova_compute[192716]: 2025-10-07 21:54:58.047 2 WARNING neutronclient.v2_0.client [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:54:58 compute-0 nova_compute[192716]: 2025-10-07 21:54:58.910 2 INFO nova.virt.libvirt.driver [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Creating config drive at /var/lib/nova/instances/6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8/disk.config
Oct 07 21:54:58 compute-0 nova_compute[192716]: 2025-10-07 21:54:58.920 2 DEBUG oslo_concurrency.processutils [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251007122402.7278e66.el10 -quiet -J -r -V config-2 /tmp/tmpcn2ox2s6 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:54:59 compute-0 nova_compute[192716]: 2025-10-07 21:54:59.062 2 DEBUG oslo_concurrency.processutils [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251007122402.7278e66.el10 -quiet -J -r -V config-2 /tmp/tmpcn2ox2s6" returned: 0 in 0.141s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:54:59 compute-0 kernel: tapffef8458-72: entered promiscuous mode
Oct 07 21:54:59 compute-0 NetworkManager[51722]: <info>  [1759874099.1599] manager: (tapffef8458-72): new Tun device (/org/freedesktop/NetworkManager/Devices/33)
Oct 07 21:54:59 compute-0 ovn_controller[94904]: 2025-10-07T21:54:59Z|00058|binding|INFO|Claiming lport ffef8458-72c0-4d1a-966e-e35470777c1a for this chassis.
Oct 07 21:54:59 compute-0 ovn_controller[94904]: 2025-10-07T21:54:59Z|00059|binding|INFO|ffef8458-72c0-4d1a-966e-e35470777c1a: Claiming fa:16:3e:71:3a:79 10.100.0.14
Oct 07 21:54:59 compute-0 nova_compute[192716]: 2025-10-07 21:54:59.168 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:54:59 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:54:59.174 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:71:3a:79 10.100.0.14'], port_security=['fa:16:3e:71:3a:79 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f0bd9c95-1d58-40c0-8d62-097453d85d3e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '42e6cb8a77b54158b2345b916b6fd79b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0b409cfc-ce5d-4372-a7fd-bd2f8e7211c2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=badb36bd-51e1-4b06-9dec-6b9bc7164000, chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], tunnel_key=8, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], logical_port=ffef8458-72c0-4d1a-966e-e35470777c1a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 21:54:59 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:54:59.176 103791 INFO neutron.agent.ovn.metadata.agent [-] Port ffef8458-72c0-4d1a-966e-e35470777c1a in datapath f0bd9c95-1d58-40c0-8d62-097453d85d3e bound to our chassis
Oct 07 21:54:59 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:54:59.179 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f0bd9c95-1d58-40c0-8d62-097453d85d3e
Oct 07 21:54:59 compute-0 nova_compute[192716]: 2025-10-07 21:54:59.189 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:54:59 compute-0 systemd-udevd[217312]: Network interface NamePolicy= disabled on kernel command line.
Oct 07 21:54:59 compute-0 ovn_controller[94904]: 2025-10-07T21:54:59Z|00060|binding|INFO|Setting lport ffef8458-72c0-4d1a-966e-e35470777c1a ovn-installed in OVS
Oct 07 21:54:59 compute-0 ovn_controller[94904]: 2025-10-07T21:54:59Z|00061|binding|INFO|Setting lport ffef8458-72c0-4d1a-966e-e35470777c1a up in Southbound
Oct 07 21:54:59 compute-0 nova_compute[192716]: 2025-10-07 21:54:59.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:54:59 compute-0 NetworkManager[51722]: <info>  [1759874099.2073] device (tapffef8458-72): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 07 21:54:59 compute-0 NetworkManager[51722]: <info>  [1759874099.2084] device (tapffef8458-72): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 07 21:54:59 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:54:59.209 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[162d9842-fd98-4087-8dc0-cb80bf101e92]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:54:59 compute-0 systemd-machined[152719]: New machine qemu-4-instance-00000009.
Oct 07 21:54:59 compute-0 systemd[1]: Started Virtual Machine qemu-4-instance-00000009.
Oct 07 21:54:59 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:54:59.254 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[a2ebe703-d7d1-4a4f-b816-9cf757c21318]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:54:59 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:54:59.257 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[3ab16e8a-aed7-4c62-800c-f4f16e5e6539]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:54:59 compute-0 podman[217299]: 2025-10-07 21:54:59.26797517 +0000 UTC m=+0.109644670 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, managed_by=edpm_ansible, release=1755695350, version=9.6, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Oct 07 21:54:59 compute-0 nova_compute[192716]: 2025-10-07 21:54:59.267 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:54:59 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:54:59.301 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[55b025f3-3999-4853-b112-b28c3b62bdcd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:54:59 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:54:59.320 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[9ab92fe8-1411-4b33-a288-4e6980298163]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf0bd9c95-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:94:9a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 7, 'rx_bytes': 1000, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 7, 'rx_bytes': 1000, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 369882, 'reachable_time': 24245, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217341, 'error': None, 'target': 'ovnmeta-f0bd9c95-1d58-40c0-8d62-097453d85d3e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:54:59 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:54:59.336 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[9e6b6b79-e6cd-418e-ad6d-4fa7f47acd8c]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf0bd9c95-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 369895, 'tstamp': 369895}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217343, 'error': None, 'target': 'ovnmeta-f0bd9c95-1d58-40c0-8d62-097453d85d3e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf0bd9c95-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 369898, 'tstamp': 369898}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217343, 'error': None, 'target': 'ovnmeta-f0bd9c95-1d58-40c0-8d62-097453d85d3e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:54:59 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:54:59.339 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf0bd9c95-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:54:59 compute-0 nova_compute[192716]: 2025-10-07 21:54:59.340 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:54:59 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:54:59.341 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf0bd9c95-10, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:54:59 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:54:59.341 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 21:54:59 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:54:59.342 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf0bd9c95-10, col_values=(('external_ids', {'iface-id': 'c0a40c81-05dd-4977-aaa2-2a56498aa3a2'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:54:59 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:54:59.342 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 21:54:59 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:54:59.343 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[ede52eb4-6522-4152-96c5-4c90e5af62e1]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-f0bd9c95-1d58-40c0-8d62-097453d85d3e\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/f0bd9c95-1d58-40c0-8d62-097453d85d3e.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID f0bd9c95-1d58-40c0-8d62-097453d85d3e\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:54:59 compute-0 podman[203153]: time="2025-10-07T21:54:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 21:54:59 compute-0 podman[203153]: @ - - [07/Oct/2025:21:54:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20751 "" "Go-http-client/1.1"
Oct 07 21:54:59 compute-0 podman[203153]: @ - - [07/Oct/2025:21:54:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3468 "" "Go-http-client/1.1"
Oct 07 21:54:59 compute-0 nova_compute[192716]: 2025-10-07 21:54:59.927 2 DEBUG nova.compute.manager [req-801f5beb-ba4a-4a4a-9ba1-4f831e266b39 req-8d5587d7-f71e-4318-9ce0-ade043d45f2c 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Received event network-vif-plugged-ffef8458-72c0-4d1a-966e-e35470777c1a external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 21:54:59 compute-0 nova_compute[192716]: 2025-10-07 21:54:59.928 2 DEBUG oslo_concurrency.lockutils [req-801f5beb-ba4a-4a4a-9ba1-4f831e266b39 req-8d5587d7-f71e-4318-9ce0-ade043d45f2c 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:54:59 compute-0 nova_compute[192716]: 2025-10-07 21:54:59.928 2 DEBUG oslo_concurrency.lockutils [req-801f5beb-ba4a-4a4a-9ba1-4f831e266b39 req-8d5587d7-f71e-4318-9ce0-ade043d45f2c 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:54:59 compute-0 nova_compute[192716]: 2025-10-07 21:54:59.928 2 DEBUG oslo_concurrency.lockutils [req-801f5beb-ba4a-4a4a-9ba1-4f831e266b39 req-8d5587d7-f71e-4318-9ce0-ade043d45f2c 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:54:59 compute-0 nova_compute[192716]: 2025-10-07 21:54:59.929 2 DEBUG nova.compute.manager [req-801f5beb-ba4a-4a4a-9ba1-4f831e266b39 req-8d5587d7-f71e-4318-9ce0-ade043d45f2c 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Processing event network-vif-plugged-ffef8458-72c0-4d1a-966e-e35470777c1a _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Oct 07 21:55:00 compute-0 nova_compute[192716]: 2025-10-07 21:55:00.669 2 DEBUG nova.compute.manager [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Oct 07 21:55:00 compute-0 nova_compute[192716]: 2025-10-07 21:55:00.674 2 DEBUG nova.virt.libvirt.driver [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Oct 07 21:55:00 compute-0 nova_compute[192716]: 2025-10-07 21:55:00.678 2 INFO nova.virt.libvirt.driver [-] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Instance spawned successfully.
Oct 07 21:55:00 compute-0 nova_compute[192716]: 2025-10-07 21:55:00.679 2 DEBUG nova.virt.libvirt.driver [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Oct 07 21:55:01 compute-0 nova_compute[192716]: 2025-10-07 21:55:01.026 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:55:01 compute-0 nova_compute[192716]: 2025-10-07 21:55:01.202 2 DEBUG nova.virt.libvirt.driver [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 21:55:01 compute-0 nova_compute[192716]: 2025-10-07 21:55:01.202 2 DEBUG nova.virt.libvirt.driver [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 21:55:01 compute-0 nova_compute[192716]: 2025-10-07 21:55:01.203 2 DEBUG nova.virt.libvirt.driver [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 21:55:01 compute-0 nova_compute[192716]: 2025-10-07 21:55:01.204 2 DEBUG nova.virt.libvirt.driver [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 21:55:01 compute-0 nova_compute[192716]: 2025-10-07 21:55:01.204 2 DEBUG nova.virt.libvirt.driver [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 21:55:01 compute-0 nova_compute[192716]: 2025-10-07 21:55:01.205 2 DEBUG nova.virt.libvirt.driver [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 21:55:01 compute-0 openstack_network_exporter[205305]: ERROR   21:55:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:55:01 compute-0 openstack_network_exporter[205305]: ERROR   21:55:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:55:01 compute-0 openstack_network_exporter[205305]: ERROR   21:55:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 21:55:01 compute-0 openstack_network_exporter[205305]: ERROR   21:55:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 21:55:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 21:55:01 compute-0 openstack_network_exporter[205305]: ERROR   21:55:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 21:55:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 21:55:01 compute-0 nova_compute[192716]: 2025-10-07 21:55:01.720 2 INFO nova.compute.manager [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Took 11.41 seconds to spawn the instance on the hypervisor.
Oct 07 21:55:01 compute-0 nova_compute[192716]: 2025-10-07 21:55:01.721 2 DEBUG nova.compute.manager [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 07 21:55:01 compute-0 nova_compute[192716]: 2025-10-07 21:55:01.986 2 DEBUG nova.compute.manager [req-b11b4b76-345b-4f52-9643-ed64ccf20b1a req-09ff508e-deb4-4cf7-b574-5b563e95fe75 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Received event network-vif-plugged-ffef8458-72c0-4d1a-966e-e35470777c1a external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 21:55:01 compute-0 nova_compute[192716]: 2025-10-07 21:55:01.987 2 DEBUG oslo_concurrency.lockutils [req-b11b4b76-345b-4f52-9643-ed64ccf20b1a req-09ff508e-deb4-4cf7-b574-5b563e95fe75 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:55:01 compute-0 nova_compute[192716]: 2025-10-07 21:55:01.987 2 DEBUG oslo_concurrency.lockutils [req-b11b4b76-345b-4f52-9643-ed64ccf20b1a req-09ff508e-deb4-4cf7-b574-5b563e95fe75 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:55:01 compute-0 nova_compute[192716]: 2025-10-07 21:55:01.988 2 DEBUG oslo_concurrency.lockutils [req-b11b4b76-345b-4f52-9643-ed64ccf20b1a req-09ff508e-deb4-4cf7-b574-5b563e95fe75 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:55:01 compute-0 nova_compute[192716]: 2025-10-07 21:55:01.988 2 DEBUG nova.compute.manager [req-b11b4b76-345b-4f52-9643-ed64ccf20b1a req-09ff508e-deb4-4cf7-b574-5b563e95fe75 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] No waiting events found dispatching network-vif-plugged-ffef8458-72c0-4d1a-966e-e35470777c1a pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 21:55:01 compute-0 nova_compute[192716]: 2025-10-07 21:55:01.989 2 WARNING nova.compute.manager [req-b11b4b76-345b-4f52-9643-ed64ccf20b1a req-09ff508e-deb4-4cf7-b574-5b563e95fe75 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Received unexpected event network-vif-plugged-ffef8458-72c0-4d1a-966e-e35470777c1a for instance with vm_state active and task_state None.
Oct 07 21:55:02 compute-0 nova_compute[192716]: 2025-10-07 21:55:02.256 2 INFO nova.compute.manager [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Took 16.71 seconds to build instance.
Oct 07 21:55:02 compute-0 sshd-session[217351]: Invalid user kim from 116.110.151.5 port 38898
Oct 07 21:55:02 compute-0 nova_compute[192716]: 2025-10-07 21:55:02.765 2 DEBUG oslo_concurrency.lockutils [None req-a1af62aa-bc75-49ff-b5f7-d79dd6ec36a1 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.232s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:55:02 compute-0 sshd-session[217351]: pam_unix(sshd:auth): check pass; user unknown
Oct 07 21:55:02 compute-0 sshd-session[217351]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=116.110.151.5
Oct 07 21:55:04 compute-0 nova_compute[192716]: 2025-10-07 21:55:04.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:55:04 compute-0 sshd-session[217351]: Failed password for invalid user kim from 116.110.151.5 port 38898 ssh2
Oct 07 21:55:05 compute-0 sshd-session[217351]: Connection closed by invalid user kim 116.110.151.5 port 38898 [preauth]
Oct 07 21:55:06 compute-0 nova_compute[192716]: 2025-10-07 21:55:06.028 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:55:07 compute-0 sshd-session[217353]: Invalid user helpdesk from 116.110.151.5 port 55990
Oct 07 21:55:07 compute-0 sshd-session[217353]: pam_unix(sshd:auth): check pass; user unknown
Oct 07 21:55:07 compute-0 sshd-session[217353]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=116.110.151.5
Oct 07 21:55:08 compute-0 sshd-session[217353]: Failed password for invalid user helpdesk from 116.110.151.5 port 55990 ssh2
Oct 07 21:55:09 compute-0 nova_compute[192716]: 2025-10-07 21:55:09.271 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:55:10 compute-0 sshd-session[217353]: Connection closed by invalid user helpdesk 116.110.151.5 port 55990 [preauth]
Oct 07 21:55:11 compute-0 nova_compute[192716]: 2025-10-07 21:55:11.033 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:55:11 compute-0 ovn_controller[94904]: 2025-10-07T21:55:11Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:71:3a:79 10.100.0.14
Oct 07 21:55:11 compute-0 ovn_controller[94904]: 2025-10-07T21:55:11Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:71:3a:79 10.100.0.14
Oct 07 21:55:11 compute-0 podman[217368]: 2025-10-07 21:55:11.85710671 +0000 UTC m=+0.076142354 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct 07 21:55:11 compute-0 podman[217369]: 2025-10-07 21:55:11.865002405 +0000 UTC m=+0.081186478 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, tcib_build_tag=watcher_latest, tcib_managed=true)
Oct 07 21:55:14 compute-0 nova_compute[192716]: 2025-10-07 21:55:14.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:55:15 compute-0 nova_compute[192716]: 2025-10-07 21:55:15.096 2 DEBUG nova.compute.manager [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] Stashing vm_state: active _prep_resize /usr/lib/python3.12/site-packages/nova/compute/manager.py:6169
Oct 07 21:55:15 compute-0 nova_compute[192716]: 2025-10-07 21:55:15.642 2 DEBUG oslo_concurrency.lockutils [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:55:15 compute-0 nova_compute[192716]: 2025-10-07 21:55:15.643 2 DEBUG oslo_concurrency.lockutils [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:55:15 compute-0 podman[217405]: 2025-10-07 21:55:15.868844934 +0000 UTC m=+0.101356154 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 07 21:55:16 compute-0 nova_compute[192716]: 2025-10-07 21:55:16.037 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:55:16 compute-0 nova_compute[192716]: 2025-10-07 21:55:16.167 2 DEBUG nova.objects.instance [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lazy-loading 'pci_requests' on Instance uuid b581f70a-01a7-4dcb-a224-b1a4b738aab4 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 21:55:16 compute-0 nova_compute[192716]: 2025-10-07 21:55:16.816 2 DEBUG nova.virt.hardware [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Oct 07 21:55:16 compute-0 nova_compute[192716]: 2025-10-07 21:55:16.817 2 INFO nova.compute.claims [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] Claim successful on node compute-0.ctlplane.example.com
Oct 07 21:55:16 compute-0 nova_compute[192716]: 2025-10-07 21:55:16.818 2 DEBUG nova.objects.instance [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lazy-loading 'resources' on Instance uuid b581f70a-01a7-4dcb-a224-b1a4b738aab4 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 21:55:17 compute-0 nova_compute[192716]: 2025-10-07 21:55:17.074 2 DEBUG nova.virt.libvirt.driver [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 3407f49b-7e6b-4ff7-8ade-5caf647a9bd4] Creating tmpfile /var/lib/nova/instances/tmpom2vzu2c to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Oct 07 21:55:17 compute-0 nova_compute[192716]: 2025-10-07 21:55:17.076 2 WARNING neutronclient.v2_0.client [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:55:17 compute-0 nova_compute[192716]: 2025-10-07 21:55:17.244 2 DEBUG nova.compute.manager [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpom2vzu2c',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Oct 07 21:55:17 compute-0 nova_compute[192716]: 2025-10-07 21:55:17.269 2 DEBUG oslo_concurrency.lockutils [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 07 21:55:17 compute-0 nova_compute[192716]: 2025-10-07 21:55:17.270 2 DEBUG oslo_concurrency.lockutils [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 07 21:55:17 compute-0 nova_compute[192716]: 2025-10-07 21:55:17.329 2 DEBUG nova.objects.base [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Object Instance<b581f70a-01a7-4dcb-a224-b1a4b738aab4> lazy-loaded attributes: pci_requests,resources wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Oct 07 21:55:17 compute-0 nova_compute[192716]: 2025-10-07 21:55:17.330 2 DEBUG nova.objects.instance [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lazy-loading 'numa_topology' on Instance uuid b581f70a-01a7-4dcb-a224-b1a4b738aab4 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 21:55:17 compute-0 nova_compute[192716]: 2025-10-07 21:55:17.776 2 INFO nova.compute.rpcapi [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Automatically selected compute RPC version 6.4 from minimum service version 70
Oct 07 21:55:17 compute-0 nova_compute[192716]: 2025-10-07 21:55:17.778 2 DEBUG oslo_concurrency.lockutils [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 07 21:55:17 compute-0 nova_compute[192716]: 2025-10-07 21:55:17.836 2 DEBUG nova.objects.base [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Object Instance<b581f70a-01a7-4dcb-a224-b1a4b738aab4> lazy-loaded attributes: pci_requests,resources,numa_topology wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Oct 07 21:55:17 compute-0 nova_compute[192716]: 2025-10-07 21:55:17.837 2 DEBUG nova.objects.instance [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lazy-loading 'pci_devices' on Instance uuid b581f70a-01a7-4dcb-a224-b1a4b738aab4 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 21:55:18 compute-0 nova_compute[192716]: 2025-10-07 21:55:18.345 2 DEBUG nova.objects.base [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Object Instance<b581f70a-01a7-4dcb-a224-b1a4b738aab4> lazy-loaded attributes: pci_requests,resources,numa_topology,pci_devices wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Oct 07 21:55:18 compute-0 nova_compute[192716]: 2025-10-07 21:55:18.901 2 INFO nova.compute.resource_tracker [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] Updating resource usage from migration 1c50a70a-6da7-4b03-9692-19a70065ff28
Oct 07 21:55:18 compute-0 nova_compute[192716]: 2025-10-07 21:55:18.902 2 DEBUG nova.compute.resource_tracker [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] Starting to track incoming migration 1c50a70a-6da7-4b03-9692-19a70065ff28 with flavor e6ccb6f6-2ec4-4305-9fdd-4d931e83ed21 _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Oct 07 21:55:19 compute-0 nova_compute[192716]: 2025-10-07 21:55:19.277 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:55:19 compute-0 nova_compute[192716]: 2025-10-07 21:55:19.580 2 DEBUG nova.compute.provider_tree [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 21:55:19 compute-0 nova_compute[192716]: 2025-10-07 21:55:19.791 2 WARNING neutronclient.v2_0.client [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:55:20 compute-0 nova_compute[192716]: 2025-10-07 21:55:20.095 2 DEBUG nova.scheduler.client.report [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 21:55:20 compute-0 nova_compute[192716]: 2025-10-07 21:55:20.605 2 DEBUG oslo_concurrency.lockutils [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 4.962s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:55:20 compute-0 nova_compute[192716]: 2025-10-07 21:55:20.606 2 INFO nova.compute.manager [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] Migrating
Oct 07 21:55:21 compute-0 nova_compute[192716]: 2025-10-07 21:55:21.069 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:55:22 compute-0 podman[217431]: 2025-10-07 21:55:22.869213235 +0000 UTC m=+0.107257628 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=watcher_latest)
Oct 07 21:55:23 compute-0 nova_compute[192716]: 2025-10-07 21:55:23.559 2 DEBUG nova.compute.manager [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpom2vzu2c',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='3407f49b-7e6b-4ff7-8ade-5caf647a9bd4',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Oct 07 21:55:24 compute-0 nova_compute[192716]: 2025-10-07 21:55:24.211 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:55:24 compute-0 nova_compute[192716]: 2025-10-07 21:55:24.212 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:55:24 compute-0 nova_compute[192716]: 2025-10-07 21:55:24.212 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:55:24 compute-0 nova_compute[192716]: 2025-10-07 21:55:24.213 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:55:24 compute-0 nova_compute[192716]: 2025-10-07 21:55:24.213 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 07 21:55:24 compute-0 nova_compute[192716]: 2025-10-07 21:55:24.279 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:55:24 compute-0 sshd-session[217457]: Accepted publickey for nova from 192.168.122.101 port 41516 ssh2: ECDSA SHA256:iDJLkLV2qm60EYShWQ+1ijfKpIG1clzEi4IXirRVpJQ
Oct 07 21:55:24 compute-0 systemd-logind[798]: New session 29 of user nova.
Oct 07 21:55:24 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Oct 07 21:55:24 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Oct 07 21:55:24 compute-0 nova_compute[192716]: 2025-10-07 21:55:24.576 2 DEBUG oslo_concurrency.lockutils [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "refresh_cache-3407f49b-7e6b-4ff7-8ade-5caf647a9bd4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 07 21:55:24 compute-0 nova_compute[192716]: 2025-10-07 21:55:24.576 2 DEBUG oslo_concurrency.lockutils [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquired lock "refresh_cache-3407f49b-7e6b-4ff7-8ade-5caf647a9bd4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 07 21:55:24 compute-0 nova_compute[192716]: 2025-10-07 21:55:24.576 2 DEBUG nova.network.neutron [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 3407f49b-7e6b-4ff7-8ade-5caf647a9bd4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 07 21:55:24 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Oct 07 21:55:24 compute-0 systemd[1]: Starting User Manager for UID 42436...
Oct 07 21:55:24 compute-0 systemd[217461]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Oct 07 21:55:24 compute-0 systemd[217461]: Queued start job for default target Main User Target.
Oct 07 21:55:24 compute-0 systemd[217461]: Created slice User Application Slice.
Oct 07 21:55:24 compute-0 systemd[217461]: Started Mark boot as successful after the user session has run 2 minutes.
Oct 07 21:55:24 compute-0 systemd[217461]: Started Daily Cleanup of User's Temporary Directories.
Oct 07 21:55:24 compute-0 systemd[217461]: Reached target Paths.
Oct 07 21:55:24 compute-0 systemd[217461]: Reached target Timers.
Oct 07 21:55:24 compute-0 systemd[217461]: Starting D-Bus User Message Bus Socket...
Oct 07 21:55:24 compute-0 systemd[217461]: Starting Create User's Volatile Files and Directories...
Oct 07 21:55:24 compute-0 systemd[217461]: Listening on D-Bus User Message Bus Socket.
Oct 07 21:55:24 compute-0 systemd[217461]: Reached target Sockets.
Oct 07 21:55:24 compute-0 systemd[217461]: Finished Create User's Volatile Files and Directories.
Oct 07 21:55:24 compute-0 systemd[217461]: Reached target Basic System.
Oct 07 21:55:24 compute-0 systemd[217461]: Reached target Main User Target.
Oct 07 21:55:24 compute-0 systemd[217461]: Startup finished in 155ms.
Oct 07 21:55:24 compute-0 systemd[1]: Started User Manager for UID 42436.
Oct 07 21:55:24 compute-0 systemd[1]: Started Session 29 of User nova.
Oct 07 21:55:24 compute-0 sshd-session[217457]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Oct 07 21:55:24 compute-0 sshd-session[217476]: Received disconnect from 192.168.122.101 port 41516:11: disconnected by user
Oct 07 21:55:24 compute-0 sshd-session[217476]: Disconnected from user nova 192.168.122.101 port 41516
Oct 07 21:55:24 compute-0 sshd-session[217457]: pam_unix(sshd:session): session closed for user nova
Oct 07 21:55:24 compute-0 systemd[1]: session-29.scope: Deactivated successfully.
Oct 07 21:55:24 compute-0 systemd-logind[798]: Session 29 logged out. Waiting for processes to exit.
Oct 07 21:55:24 compute-0 systemd-logind[798]: Removed session 29.
Oct 07 21:55:25 compute-0 sshd-session[217478]: Accepted publickey for nova from 192.168.122.101 port 41532 ssh2: ECDSA SHA256:iDJLkLV2qm60EYShWQ+1ijfKpIG1clzEi4IXirRVpJQ
Oct 07 21:55:25 compute-0 systemd[1]: Started Session 31 of User nova.
Oct 07 21:55:25 compute-0 systemd-logind[798]: New session 31 of user nova.
Oct 07 21:55:25 compute-0 sshd-session[217478]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Oct 07 21:55:25 compute-0 nova_compute[192716]: 2025-10-07 21:55:25.083 2 WARNING neutronclient.v2_0.client [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:55:25 compute-0 podman[217480]: 2025-10-07 21:55:25.134189137 +0000 UTC m=+0.098771846 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Oct 07 21:55:25 compute-0 sshd-session[217489]: Received disconnect from 192.168.122.101 port 41532:11: disconnected by user
Oct 07 21:55:25 compute-0 sshd-session[217489]: Disconnected from user nova 192.168.122.101 port 41532
Oct 07 21:55:25 compute-0 sshd-session[217478]: pam_unix(sshd:session): session closed for user nova
Oct 07 21:55:25 compute-0 systemd[1]: session-31.scope: Deactivated successfully.
Oct 07 21:55:25 compute-0 systemd-logind[798]: Session 31 logged out. Waiting for processes to exit.
Oct 07 21:55:25 compute-0 systemd-logind[798]: Removed session 31.
Oct 07 21:55:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:55:25.611 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:55:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:55:25.611 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:55:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:55:25.614 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.003s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:55:25 compute-0 nova_compute[192716]: 2025-10-07 21:55:25.991 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:55:26 compute-0 nova_compute[192716]: 2025-10-07 21:55:26.072 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:55:26 compute-0 nova_compute[192716]: 2025-10-07 21:55:26.077 2 WARNING neutronclient.v2_0.client [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:55:26 compute-0 nova_compute[192716]: 2025-10-07 21:55:26.793 2 DEBUG nova.network.neutron [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 3407f49b-7e6b-4ff7-8ade-5caf647a9bd4] Updating instance_info_cache with network_info: [{"id": "a229ff5f-bd97-4beb-90ef-746057f7bbee", "address": "fa:16:3e:28:d9:17", "network": {"id": "f0bd9c95-1d58-40c0-8d62-097453d85d3e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-309070824-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97ef6e0949aa4dd8b3ac7e1495d532e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa229ff5f-bd", "ovs_interfaceid": "a229ff5f-bd97-4beb-90ef-746057f7bbee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 21:55:26 compute-0 nova_compute[192716]: 2025-10-07 21:55:26.985 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:55:27 compute-0 nova_compute[192716]: 2025-10-07 21:55:27.302 2 DEBUG oslo_concurrency.lockutils [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Releasing lock "refresh_cache-3407f49b-7e6b-4ff7-8ade-5caf647a9bd4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 07 21:55:27 compute-0 nova_compute[192716]: 2025-10-07 21:55:27.320 2 DEBUG nova.virt.libvirt.driver [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 3407f49b-7e6b-4ff7-8ade-5caf647a9bd4] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpom2vzu2c',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='3407f49b-7e6b-4ff7-8ade-5caf647a9bd4',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Oct 07 21:55:27 compute-0 nova_compute[192716]: 2025-10-07 21:55:27.321 2 DEBUG nova.virt.libvirt.driver [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 3407f49b-7e6b-4ff7-8ade-5caf647a9bd4] Creating instance directory: /var/lib/nova/instances/3407f49b-7e6b-4ff7-8ade-5caf647a9bd4 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Oct 07 21:55:27 compute-0 nova_compute[192716]: 2025-10-07 21:55:27.322 2 DEBUG nova.virt.libvirt.driver [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 3407f49b-7e6b-4ff7-8ade-5caf647a9bd4] Creating disk.info with the contents: {'/var/lib/nova/instances/3407f49b-7e6b-4ff7-8ade-5caf647a9bd4/disk': 'qcow2', '/var/lib/nova/instances/3407f49b-7e6b-4ff7-8ade-5caf647a9bd4/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Oct 07 21:55:27 compute-0 nova_compute[192716]: 2025-10-07 21:55:27.323 2 DEBUG nova.virt.libvirt.driver [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 3407f49b-7e6b-4ff7-8ade-5caf647a9bd4] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Oct 07 21:55:27 compute-0 nova_compute[192716]: 2025-10-07 21:55:27.324 2 DEBUG nova.objects.instance [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 3407f49b-7e6b-4ff7-8ade-5caf647a9bd4 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 21:55:27 compute-0 nova_compute[192716]: 2025-10-07 21:55:27.500 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:55:27 compute-0 nova_compute[192716]: 2025-10-07 21:55:27.501 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:55:27 compute-0 nova_compute[192716]: 2025-10-07 21:55:27.501 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:55:27 compute-0 nova_compute[192716]: 2025-10-07 21:55:27.833 2 DEBUG oslo_utils.imageutils.format_inspector [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 21:55:27 compute-0 nova_compute[192716]: 2025-10-07 21:55:27.839 2 DEBUG oslo_utils.imageutils.format_inspector [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 21:55:27 compute-0 nova_compute[192716]: 2025-10-07 21:55:27.842 2 DEBUG oslo_concurrency.processutils [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:55:27 compute-0 nova_compute[192716]: 2025-10-07 21:55:27.916 2 DEBUG oslo_concurrency.processutils [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:55:27 compute-0 nova_compute[192716]: 2025-10-07 21:55:27.917 2 DEBUG oslo_concurrency.lockutils [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:55:27 compute-0 nova_compute[192716]: 2025-10-07 21:55:27.919 2 DEBUG oslo_concurrency.lockutils [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:55:27 compute-0 nova_compute[192716]: 2025-10-07 21:55:27.919 2 DEBUG oslo_utils.imageutils.format_inspector [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 21:55:27 compute-0 nova_compute[192716]: 2025-10-07 21:55:27.925 2 DEBUG oslo_utils.imageutils.format_inspector [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 21:55:27 compute-0 nova_compute[192716]: 2025-10-07 21:55:27.926 2 DEBUG oslo_concurrency.processutils [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:55:27 compute-0 nova_compute[192716]: 2025-10-07 21:55:27.963 2 DEBUG nova.compute.manager [req-64addd5e-4fb3-415b-bf84-6dc65a17eaa6 req-97cd2af9-20e0-45bf-990c-5f2ab338d58a 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] Received event network-vif-unplugged-6ece1b44-4312-4bd7-9ca7-9592ec9faf78 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 21:55:27 compute-0 nova_compute[192716]: 2025-10-07 21:55:27.963 2 DEBUG oslo_concurrency.lockutils [req-64addd5e-4fb3-415b-bf84-6dc65a17eaa6 req-97cd2af9-20e0-45bf-990c-5f2ab338d58a 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "b581f70a-01a7-4dcb-a224-b1a4b738aab4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:55:27 compute-0 nova_compute[192716]: 2025-10-07 21:55:27.964 2 DEBUG oslo_concurrency.lockutils [req-64addd5e-4fb3-415b-bf84-6dc65a17eaa6 req-97cd2af9-20e0-45bf-990c-5f2ab338d58a 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "b581f70a-01a7-4dcb-a224-b1a4b738aab4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:55:27 compute-0 nova_compute[192716]: 2025-10-07 21:55:27.964 2 DEBUG oslo_concurrency.lockutils [req-64addd5e-4fb3-415b-bf84-6dc65a17eaa6 req-97cd2af9-20e0-45bf-990c-5f2ab338d58a 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "b581f70a-01a7-4dcb-a224-b1a4b738aab4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:55:27 compute-0 nova_compute[192716]: 2025-10-07 21:55:27.965 2 DEBUG nova.compute.manager [req-64addd5e-4fb3-415b-bf84-6dc65a17eaa6 req-97cd2af9-20e0-45bf-990c-5f2ab338d58a 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] No waiting events found dispatching network-vif-unplugged-6ece1b44-4312-4bd7-9ca7-9592ec9faf78 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 21:55:27 compute-0 nova_compute[192716]: 2025-10-07 21:55:27.965 2 WARNING nova.compute.manager [req-64addd5e-4fb3-415b-bf84-6dc65a17eaa6 req-97cd2af9-20e0-45bf-990c-5f2ab338d58a 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] Received unexpected event network-vif-unplugged-6ece1b44-4312-4bd7-9ca7-9592ec9faf78 for instance with vm_state active and task_state resize_migrating.
Oct 07 21:55:27 compute-0 nova_compute[192716]: 2025-10-07 21:55:27.993 2 DEBUG oslo_concurrency.processutils [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:55:27 compute-0 nova_compute[192716]: 2025-10-07 21:55:27.994 2 DEBUG oslo_concurrency.processutils [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71,backing_fmt=raw /var/lib/nova/instances/3407f49b-7e6b-4ff7-8ade-5caf647a9bd4/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:55:28 compute-0 nova_compute[192716]: 2025-10-07 21:55:28.017 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:55:28 compute-0 nova_compute[192716]: 2025-10-07 21:55:28.018 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:55:28 compute-0 nova_compute[192716]: 2025-10-07 21:55:28.018 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:55:28 compute-0 nova_compute[192716]: 2025-10-07 21:55:28.019 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 07 21:55:28 compute-0 nova_compute[192716]: 2025-10-07 21:55:28.040 2 DEBUG oslo_concurrency.processutils [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71,backing_fmt=raw /var/lib/nova/instances/3407f49b-7e6b-4ff7-8ade-5caf647a9bd4/disk 1073741824" returned: 0 in 0.047s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:55:28 compute-0 nova_compute[192716]: 2025-10-07 21:55:28.041 2 DEBUG oslo_concurrency.lockutils [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.122s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:55:28 compute-0 nova_compute[192716]: 2025-10-07 21:55:28.042 2 DEBUG oslo_concurrency.processutils [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:55:28 compute-0 nova_compute[192716]: 2025-10-07 21:55:28.104 2 DEBUG oslo_concurrency.processutils [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:55:28 compute-0 nova_compute[192716]: 2025-10-07 21:55:28.105 2 DEBUG nova.virt.disk.api [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Checking if we can resize image /var/lib/nova/instances/3407f49b-7e6b-4ff7-8ade-5caf647a9bd4/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 07 21:55:28 compute-0 nova_compute[192716]: 2025-10-07 21:55:28.105 2 DEBUG oslo_concurrency.processutils [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3407f49b-7e6b-4ff7-8ade-5caf647a9bd4/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:55:28 compute-0 nova_compute[192716]: 2025-10-07 21:55:28.161 2 DEBUG oslo_concurrency.processutils [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3407f49b-7e6b-4ff7-8ade-5caf647a9bd4/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:55:28 compute-0 nova_compute[192716]: 2025-10-07 21:55:28.163 2 DEBUG nova.virt.disk.api [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Cannot resize image /var/lib/nova/instances/3407f49b-7e6b-4ff7-8ade-5caf647a9bd4/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 07 21:55:28 compute-0 nova_compute[192716]: 2025-10-07 21:55:28.163 2 DEBUG nova.objects.instance [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lazy-loading 'migration_context' on Instance uuid 3407f49b-7e6b-4ff7-8ade-5caf647a9bd4 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 21:55:28 compute-0 sshd-session[217521]: Accepted publickey for nova from 192.168.122.101 port 41548 ssh2: ECDSA SHA256:iDJLkLV2qm60EYShWQ+1ijfKpIG1clzEi4IXirRVpJQ
Oct 07 21:55:28 compute-0 systemd-logind[798]: New session 32 of user nova.
Oct 07 21:55:28 compute-0 systemd[1]: Started Session 32 of User nova.
Oct 07 21:55:28 compute-0 sshd-session[217521]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Oct 07 21:55:28 compute-0 nova_compute[192716]: 2025-10-07 21:55:28.671 2 DEBUG nova.objects.base [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Object Instance<3407f49b-7e6b-4ff7-8ade-5caf647a9bd4> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Oct 07 21:55:28 compute-0 nova_compute[192716]: 2025-10-07 21:55:28.672 2 DEBUG oslo_concurrency.processutils [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/3407f49b-7e6b-4ff7-8ade-5caf647a9bd4/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:55:28 compute-0 nova_compute[192716]: 2025-10-07 21:55:28.721 2 DEBUG oslo_concurrency.processutils [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/3407f49b-7e6b-4ff7-8ade-5caf647a9bd4/disk.config 497664" returned: 0 in 0.050s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:55:28 compute-0 nova_compute[192716]: 2025-10-07 21:55:28.723 2 DEBUG nova.virt.libvirt.driver [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 3407f49b-7e6b-4ff7-8ade-5caf647a9bd4] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Oct 07 21:55:28 compute-0 nova_compute[192716]: 2025-10-07 21:55:28.725 2 DEBUG nova.virt.libvirt.vif [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-10-07T21:53:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1675061367',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1675061367',id=6,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-07T21:53:46Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='42e6cb8a77b54158b2345b916b6fd79b',ramdisk_id='',reservation_id='r-irf06lw5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1409880739',owner_user_name='tempest-TestExecuteActionsViaActuator-1409880739-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T21:53:47Z,user_data=None,user_id='b71b837a81994b9694ede764e0406ac8',uuid=3407f49b-7e6b-4ff7-8ade-5caf647a9bd4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a229ff5f-bd97-4beb-90ef-746057f7bbee", "address": "fa:16:3e:28:d9:17", "network": {"id": "f0bd9c95-1d58-40c0-8d62-097453d85d3e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-309070824-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97ef6e0949aa4dd8b3ac7e1495d532e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapa229ff5f-bd", "ovs_interfaceid": "a229ff5f-bd97-4beb-90ef-746057f7bbee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 07 21:55:28 compute-0 nova_compute[192716]: 2025-10-07 21:55:28.726 2 DEBUG nova.network.os_vif_util [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Converting VIF {"id": "a229ff5f-bd97-4beb-90ef-746057f7bbee", "address": "fa:16:3e:28:d9:17", "network": {"id": "f0bd9c95-1d58-40c0-8d62-097453d85d3e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-309070824-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97ef6e0949aa4dd8b3ac7e1495d532e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapa229ff5f-bd", "ovs_interfaceid": "a229ff5f-bd97-4beb-90ef-746057f7bbee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 21:55:28 compute-0 nova_compute[192716]: 2025-10-07 21:55:28.727 2 DEBUG nova.network.os_vif_util [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:28:d9:17,bridge_name='br-int',has_traffic_filtering=True,id=a229ff5f-bd97-4beb-90ef-746057f7bbee,network=Network(f0bd9c95-1d58-40c0-8d62-097453d85d3e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa229ff5f-bd') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 21:55:28 compute-0 nova_compute[192716]: 2025-10-07 21:55:28.728 2 DEBUG os_vif [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:28:d9:17,bridge_name='br-int',has_traffic_filtering=True,id=a229ff5f-bd97-4beb-90ef-746057f7bbee,network=Network(f0bd9c95-1d58-40c0-8d62-097453d85d3e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa229ff5f-bd') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 07 21:55:28 compute-0 nova_compute[192716]: 2025-10-07 21:55:28.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:55:28 compute-0 nova_compute[192716]: 2025-10-07 21:55:28.730 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:55:28 compute-0 nova_compute[192716]: 2025-10-07 21:55:28.731 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 21:55:28 compute-0 nova_compute[192716]: 2025-10-07 21:55:28.733 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:55:28 compute-0 nova_compute[192716]: 2025-10-07 21:55:28.733 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '179bdfc6-32db-5a8c-93d4-3c8cc85444f3', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:55:28 compute-0 nova_compute[192716]: 2025-10-07 21:55:28.735 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:55:28 compute-0 nova_compute[192716]: 2025-10-07 21:55:28.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:55:28 compute-0 nova_compute[192716]: 2025-10-07 21:55:28.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:55:28 compute-0 nova_compute[192716]: 2025-10-07 21:55:28.743 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa229ff5f-bd, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:55:28 compute-0 nova_compute[192716]: 2025-10-07 21:55:28.744 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapa229ff5f-bd, col_values=(('qos', UUID('db9f03c6-a578-4457-abbf-c54a3865eb29')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:55:28 compute-0 nova_compute[192716]: 2025-10-07 21:55:28.744 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapa229ff5f-bd, col_values=(('external_ids', {'iface-id': 'a229ff5f-bd97-4beb-90ef-746057f7bbee', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:28:d9:17', 'vm-uuid': '3407f49b-7e6b-4ff7-8ade-5caf647a9bd4'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:55:28 compute-0 NetworkManager[51722]: <info>  [1759874128.7470] manager: (tapa229ff5f-bd): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/34)
Oct 07 21:55:28 compute-0 nova_compute[192716]: 2025-10-07 21:55:28.749 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 07 21:55:28 compute-0 nova_compute[192716]: 2025-10-07 21:55:28.759 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:55:28 compute-0 nova_compute[192716]: 2025-10-07 21:55:28.760 2 INFO os_vif [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:28:d9:17,bridge_name='br-int',has_traffic_filtering=True,id=a229ff5f-bd97-4beb-90ef-746057f7bbee,network=Network(f0bd9c95-1d58-40c0-8d62-097453d85d3e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa229ff5f-bd')
Oct 07 21:55:28 compute-0 nova_compute[192716]: 2025-10-07 21:55:28.761 2 DEBUG nova.virt.libvirt.driver [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Oct 07 21:55:28 compute-0 nova_compute[192716]: 2025-10-07 21:55:28.761 2 DEBUG nova.compute.manager [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpom2vzu2c',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='3407f49b-7e6b-4ff7-8ade-5caf647a9bd4',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Oct 07 21:55:28 compute-0 nova_compute[192716]: 2025-10-07 21:55:28.762 2 WARNING neutronclient.v2_0.client [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:55:29 compute-0 sshd-session[217524]: Received disconnect from 192.168.122.101 port 41548:11: disconnected by user
Oct 07 21:55:29 compute-0 sshd-session[217524]: Disconnected from user nova 192.168.122.101 port 41548
Oct 07 21:55:29 compute-0 sshd-session[217521]: pam_unix(sshd:session): session closed for user nova
Oct 07 21:55:29 compute-0 systemd[1]: session-32.scope: Deactivated successfully.
Oct 07 21:55:29 compute-0 systemd-logind[798]: Session 32 logged out. Waiting for processes to exit.
Oct 07 21:55:29 compute-0 systemd-logind[798]: Removed session 32.
Oct 07 21:55:29 compute-0 nova_compute[192716]: 2025-10-07 21:55:29.092 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c0b3d97e-60fb-487c-90d3-2b48392ff09f/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:55:29 compute-0 nova_compute[192716]: 2025-10-07 21:55:29.156 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c0b3d97e-60fb-487c-90d3-2b48392ff09f/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:55:29 compute-0 nova_compute[192716]: 2025-10-07 21:55:29.157 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c0b3d97e-60fb-487c-90d3-2b48392ff09f/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:55:29 compute-0 nova_compute[192716]: 2025-10-07 21:55:29.224 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c0b3d97e-60fb-487c-90d3-2b48392ff09f/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:55:29 compute-0 sshd-session[217533]: Accepted publickey for nova from 192.168.122.101 port 41564 ssh2: ECDSA SHA256:iDJLkLV2qm60EYShWQ+1ijfKpIG1clzEi4IXirRVpJQ
Oct 07 21:55:29 compute-0 nova_compute[192716]: 2025-10-07 21:55:29.233 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:55:29 compute-0 systemd-logind[798]: New session 33 of user nova.
Oct 07 21:55:29 compute-0 systemd[1]: Started Session 33 of User nova.
Oct 07 21:55:29 compute-0 sshd-session[217533]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Oct 07 21:55:29 compute-0 nova_compute[192716]: 2025-10-07 21:55:29.282 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:55:29 compute-0 nova_compute[192716]: 2025-10-07 21:55:29.302 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:55:29 compute-0 nova_compute[192716]: 2025-10-07 21:55:29.303 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:55:29 compute-0 sshd-session[217542]: Received disconnect from 192.168.122.101 port 41564:11: disconnected by user
Oct 07 21:55:29 compute-0 sshd-session[217542]: Disconnected from user nova 192.168.122.101 port 41564
Oct 07 21:55:29 compute-0 sshd-session[217533]: pam_unix(sshd:session): session closed for user nova
Oct 07 21:55:29 compute-0 systemd[1]: session-33.scope: Deactivated successfully.
Oct 07 21:55:29 compute-0 systemd-logind[798]: Session 33 logged out. Waiting for processes to exit.
Oct 07 21:55:29 compute-0 nova_compute[192716]: 2025-10-07 21:55:29.361 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:55:29 compute-0 nova_compute[192716]: 2025-10-07 21:55:29.368 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5138bd92-9a6e-4088-b0b2-bee3a14683ac/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:55:29 compute-0 systemd-logind[798]: Removed session 33.
Oct 07 21:55:29 compute-0 nova_compute[192716]: 2025-10-07 21:55:29.447 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5138bd92-9a6e-4088-b0b2-bee3a14683ac/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:55:29 compute-0 nova_compute[192716]: 2025-10-07 21:55:29.448 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5138bd92-9a6e-4088-b0b2-bee3a14683ac/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:55:29 compute-0 podman[217549]: 2025-10-07 21:55:29.476169196 +0000 UTC m=+0.083968483 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, release=1755695350, architecture=x86_64, build-date=2025-08-20T13:12:41, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, version=9.6, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct 07 21:55:29 compute-0 nova_compute[192716]: 2025-10-07 21:55:29.507 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5138bd92-9a6e-4088-b0b2-bee3a14683ac/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:55:29 compute-0 sshd-session[217551]: Accepted publickey for nova from 192.168.122.101 port 41576 ssh2: ECDSA SHA256:iDJLkLV2qm60EYShWQ+1ijfKpIG1clzEi4IXirRVpJQ
Oct 07 21:55:29 compute-0 systemd-logind[798]: New session 34 of user nova.
Oct 07 21:55:29 compute-0 systemd[1]: Started Session 34 of User nova.
Oct 07 21:55:29 compute-0 sshd-session[217551]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Oct 07 21:55:29 compute-0 sshd-session[217580]: Received disconnect from 192.168.122.101 port 41576:11: disconnected by user
Oct 07 21:55:29 compute-0 sshd-session[217580]: Disconnected from user nova 192.168.122.101 port 41576
Oct 07 21:55:29 compute-0 sshd-session[217551]: pam_unix(sshd:session): session closed for user nova
Oct 07 21:55:29 compute-0 systemd[1]: session-34.scope: Deactivated successfully.
Oct 07 21:55:29 compute-0 systemd-logind[798]: Session 34 logged out. Waiting for processes to exit.
Oct 07 21:55:29 compute-0 systemd-logind[798]: Removed session 34.
Oct 07 21:55:29 compute-0 podman[203153]: time="2025-10-07T21:55:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 21:55:29 compute-0 podman[203153]: @ - - [07/Oct/2025:21:55:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20751 "" "Go-http-client/1.1"
Oct 07 21:55:29 compute-0 podman[203153]: @ - - [07/Oct/2025:21:55:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3464 "" "Go-http-client/1.1"
Oct 07 21:55:29 compute-0 nova_compute[192716]: 2025-10-07 21:55:29.767 2 WARNING nova.virt.libvirt.driver [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 21:55:29 compute-0 nova_compute[192716]: 2025-10-07 21:55:29.768 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:55:29 compute-0 nova_compute[192716]: 2025-10-07 21:55:29.797 2 WARNING neutronclient.v2_0.client [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:55:29 compute-0 nova_compute[192716]: 2025-10-07 21:55:29.801 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.032s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:55:29 compute-0 nova_compute[192716]: 2025-10-07 21:55:29.801 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5355MB free_disk=73.22014999389648GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 07 21:55:29 compute-0 nova_compute[192716]: 2025-10-07 21:55:29.802 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:55:29 compute-0 nova_compute[192716]: 2025-10-07 21:55:29.802 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:55:30 compute-0 nova_compute[192716]: 2025-10-07 21:55:30.025 2 DEBUG nova.compute.manager [req-2193e2c5-a36b-4168-b748-b2bc6ed1efaa req-d4cdd356-c622-4c05-8a16-9c8695e1f16e 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] Received event network-vif-unplugged-6ece1b44-4312-4bd7-9ca7-9592ec9faf78 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 21:55:30 compute-0 nova_compute[192716]: 2025-10-07 21:55:30.026 2 DEBUG oslo_concurrency.lockutils [req-2193e2c5-a36b-4168-b748-b2bc6ed1efaa req-d4cdd356-c622-4c05-8a16-9c8695e1f16e 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "b581f70a-01a7-4dcb-a224-b1a4b738aab4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:55:30 compute-0 nova_compute[192716]: 2025-10-07 21:55:30.026 2 DEBUG oslo_concurrency.lockutils [req-2193e2c5-a36b-4168-b748-b2bc6ed1efaa req-d4cdd356-c622-4c05-8a16-9c8695e1f16e 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "b581f70a-01a7-4dcb-a224-b1a4b738aab4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:55:30 compute-0 nova_compute[192716]: 2025-10-07 21:55:30.027 2 DEBUG oslo_concurrency.lockutils [req-2193e2c5-a36b-4168-b748-b2bc6ed1efaa req-d4cdd356-c622-4c05-8a16-9c8695e1f16e 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "b581f70a-01a7-4dcb-a224-b1a4b738aab4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:55:30 compute-0 nova_compute[192716]: 2025-10-07 21:55:30.028 2 DEBUG nova.compute.manager [req-2193e2c5-a36b-4168-b748-b2bc6ed1efaa req-d4cdd356-c622-4c05-8a16-9c8695e1f16e 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] No waiting events found dispatching network-vif-unplugged-6ece1b44-4312-4bd7-9ca7-9592ec9faf78 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 21:55:30 compute-0 nova_compute[192716]: 2025-10-07 21:55:30.028 2 WARNING nova.compute.manager [req-2193e2c5-a36b-4168-b748-b2bc6ed1efaa req-d4cdd356-c622-4c05-8a16-9c8695e1f16e 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] Received unexpected event network-vif-unplugged-6ece1b44-4312-4bd7-9ca7-9592ec9faf78 for instance with vm_state active and task_state resize_migrating.
Oct 07 21:55:30 compute-0 unix_chkpwd[217583]: password check failed for user (bin)
Oct 07 21:55:30 compute-0 sshd-session[217506]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=116.110.151.5  user=bin
Oct 07 21:55:30 compute-0 nova_compute[192716]: 2025-10-07 21:55:30.832 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Migration for instance b581f70a-01a7-4dcb-a224-b1a4b738aab4 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Oct 07 21:55:30 compute-0 nova_compute[192716]: 2025-10-07 21:55:30.833 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Migration for instance 3407f49b-7e6b-4ff7-8ade-5caf647a9bd4 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Oct 07 21:55:31 compute-0 openstack_network_exporter[205305]: ERROR   21:55:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 21:55:31 compute-0 openstack_network_exporter[205305]: ERROR   21:55:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:55:31 compute-0 openstack_network_exporter[205305]: ERROR   21:55:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:55:31 compute-0 openstack_network_exporter[205305]: ERROR   21:55:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 21:55:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 21:55:31 compute-0 openstack_network_exporter[205305]: ERROR   21:55:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 21:55:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 21:55:31 compute-0 nova_compute[192716]: 2025-10-07 21:55:31.851 2 INFO nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] Updating resource usage from migration 1c50a70a-6da7-4b03-9692-19a70065ff28
Oct 07 21:55:31 compute-0 nova_compute[192716]: 2025-10-07 21:55:31.852 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] Starting to track incoming migration 1c50a70a-6da7-4b03-9692-19a70065ff28 with flavor e6ccb6f6-2ec4-4305-9fdd-4d931e83ed21 _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Oct 07 21:55:31 compute-0 nova_compute[192716]: 2025-10-07 21:55:31.852 2 INFO nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] [instance: 3407f49b-7e6b-4ff7-8ade-5caf647a9bd4] Updating resource usage from migration 156e5e3e-a451-4625-a4a8-8e2c152a8768
Oct 07 21:55:31 compute-0 nova_compute[192716]: 2025-10-07 21:55:31.853 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] [instance: 3407f49b-7e6b-4ff7-8ade-5caf647a9bd4] Starting to track incoming migration 156e5e3e-a451-4625-a4a8-8e2c152a8768 with flavor e6ccb6f6-2ec4-4305-9fdd-4d931e83ed21 _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Oct 07 21:55:31 compute-0 nova_compute[192716]: 2025-10-07 21:55:31.891 2 WARNING neutronclient.v2_0.client [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:55:32 compute-0 nova_compute[192716]: 2025-10-07 21:55:32.406 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Instance 5138bd92-9a6e-4088-b0b2-bee3a14683ac actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 07 21:55:32 compute-0 nova_compute[192716]: 2025-10-07 21:55:32.406 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Instance c0b3d97e-60fb-487c-90d3-2b48392ff09f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 07 21:55:32 compute-0 nova_compute[192716]: 2025-10-07 21:55:32.407 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Instance 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 07 21:55:32 compute-0 sshd-session[217506]: Failed password for bin from 116.110.151.5 port 33902 ssh2
Oct 07 21:55:32 compute-0 nova_compute[192716]: 2025-10-07 21:55:32.794 2 INFO nova.network.neutron [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] Updating port 6ece1b44-4312-4bd7-9ca7-9592ec9faf78 with attributes {'binding:host_id': 'compute-0.ctlplane.example.com', 'device_owner': 'compute:nova'}
Oct 07 21:55:32 compute-0 nova_compute[192716]: 2025-10-07 21:55:32.913 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] Instance with task_state "resize_migrated" is not being actively managed by this compute host but has allocations referencing this compute node (19d1aa8e-e3fb-43ab-9849-122569e48a32): {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. Skipping heal of allocations during the task state transition. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1813
Oct 07 21:55:33 compute-0 sshd-session[217506]: Connection closed by authenticating user bin 116.110.151.5 port 33902 [preauth]
Oct 07 21:55:33 compute-0 nova_compute[192716]: 2025-10-07 21:55:33.422 2 WARNING nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Instance 3407f49b-7e6b-4ff7-8ade-5caf647a9bd4 has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Oct 07 21:55:33 compute-0 nova_compute[192716]: 2025-10-07 21:55:33.423 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 5 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 07 21:55:33 compute-0 nova_compute[192716]: 2025-10-07 21:55:33.424 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=1152MB phys_disk=79GB used_disk=5GB total_vcpus=8 used_vcpus=5 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:55:29 up  1:04,  0 user,  load average: 0.32, 0.23, 0.35\n', 'num_instances': '3', 'num_vm_active': '3', 'num_task_None': '3', 'num_os_type_None': '3', 'num_proj_42e6cb8a77b54158b2345b916b6fd79b': '3', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 07 21:55:33 compute-0 sshd-session[217584]: Invalid user admin from 116.110.151.5 port 33910
Oct 07 21:55:33 compute-0 nova_compute[192716]: 2025-10-07 21:55:33.572 2 DEBUG nova.compute.provider_tree [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 21:55:33 compute-0 nova_compute[192716]: 2025-10-07 21:55:33.758 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:55:33 compute-0 sshd-session[217584]: pam_unix(sshd:auth): check pass; user unknown
Oct 07 21:55:33 compute-0 sshd-session[217584]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=116.110.151.5
Oct 07 21:55:33 compute-0 nova_compute[192716]: 2025-10-07 21:55:33.864 2 DEBUG nova.network.neutron [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 3407f49b-7e6b-4ff7-8ade-5caf647a9bd4] Port a229ff5f-bd97-4beb-90ef-746057f7bbee updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Oct 07 21:55:33 compute-0 nova_compute[192716]: 2025-10-07 21:55:33.875 2 DEBUG nova.compute.manager [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpom2vzu2c',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='3407f49b-7e6b-4ff7-8ade-5caf647a9bd4',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Oct 07 21:55:34 compute-0 nova_compute[192716]: 2025-10-07 21:55:34.082 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 21:55:34 compute-0 nova_compute[192716]: 2025-10-07 21:55:34.238 2 DEBUG oslo_concurrency.lockutils [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "refresh_cache-b581f70a-01a7-4dcb-a224-b1a4b738aab4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 07 21:55:34 compute-0 nova_compute[192716]: 2025-10-07 21:55:34.240 2 DEBUG oslo_concurrency.lockutils [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquired lock "refresh_cache-b581f70a-01a7-4dcb-a224-b1a4b738aab4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 07 21:55:34 compute-0 nova_compute[192716]: 2025-10-07 21:55:34.240 2 DEBUG nova.network.neutron [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 07 21:55:34 compute-0 nova_compute[192716]: 2025-10-07 21:55:34.283 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:55:34 compute-0 nova_compute[192716]: 2025-10-07 21:55:34.324 2 DEBUG nova.compute.manager [req-0225c4db-6f6c-42a4-a8a5-b79f29e22819 req-4f93edae-7410-4091-b5ef-5b3a4b88f4d3 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] Received event network-changed-6ece1b44-4312-4bd7-9ca7-9592ec9faf78 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 21:55:34 compute-0 nova_compute[192716]: 2025-10-07 21:55:34.324 2 DEBUG nova.compute.manager [req-0225c4db-6f6c-42a4-a8a5-b79f29e22819 req-4f93edae-7410-4091-b5ef-5b3a4b88f4d3 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] Refreshing instance network info cache due to event network-changed-6ece1b44-4312-4bd7-9ca7-9592ec9faf78. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Oct 07 21:55:34 compute-0 nova_compute[192716]: 2025-10-07 21:55:34.325 2 DEBUG oslo_concurrency.lockutils [req-0225c4db-6f6c-42a4-a8a5-b79f29e22819 req-4f93edae-7410-4091-b5ef-5b3a4b88f4d3 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "refresh_cache-b581f70a-01a7-4dcb-a224-b1a4b738aab4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 07 21:55:34 compute-0 nova_compute[192716]: 2025-10-07 21:55:34.594 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 07 21:55:34 compute-0 nova_compute[192716]: 2025-10-07 21:55:34.595 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.793s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:55:34 compute-0 nova_compute[192716]: 2025-10-07 21:55:34.749 2 WARNING neutronclient.v2_0.client [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:55:35 compute-0 nova_compute[192716]: 2025-10-07 21:55:35.511 2 WARNING neutronclient.v2_0.client [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:55:35 compute-0 nova_compute[192716]: 2025-10-07 21:55:35.708 2 DEBUG nova.network.neutron [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] Updating instance_info_cache with network_info: [{"id": "6ece1b44-4312-4bd7-9ca7-9592ec9faf78", "address": "fa:16:3e:de:04:73", "network": {"id": "f0bd9c95-1d58-40c0-8d62-097453d85d3e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-309070824-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97ef6e0949aa4dd8b3ac7e1495d532e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ece1b44-43", "ovs_interfaceid": "6ece1b44-4312-4bd7-9ca7-9592ec9faf78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 21:55:36 compute-0 systemd[1]: Starting libvirt proxy daemon...
Oct 07 21:55:36 compute-0 sshd-session[217584]: Failed password for invalid user admin from 116.110.151.5 port 33910 ssh2
Oct 07 21:55:36 compute-0 systemd[1]: Started libvirt proxy daemon.
Oct 07 21:55:36 compute-0 nova_compute[192716]: 2025-10-07 21:55:36.215 2 DEBUG oslo_concurrency.lockutils [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Releasing lock "refresh_cache-b581f70a-01a7-4dcb-a224-b1a4b738aab4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 07 21:55:36 compute-0 nova_compute[192716]: 2025-10-07 21:55:36.221 2 DEBUG oslo_concurrency.lockutils [req-0225c4db-6f6c-42a4-a8a5-b79f29e22819 req-4f93edae-7410-4091-b5ef-5b3a4b88f4d3 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquired lock "refresh_cache-b581f70a-01a7-4dcb-a224-b1a4b738aab4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 07 21:55:36 compute-0 nova_compute[192716]: 2025-10-07 21:55:36.221 2 DEBUG nova.network.neutron [req-0225c4db-6f6c-42a4-a8a5-b79f29e22819 req-4f93edae-7410-4091-b5ef-5b3a4b88f4d3 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] Refreshing network info cache for port 6ece1b44-4312-4bd7-9ca7-9592ec9faf78 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Oct 07 21:55:36 compute-0 kernel: tapa229ff5f-bd: entered promiscuous mode
Oct 07 21:55:36 compute-0 NetworkManager[51722]: <info>  [1759874136.4271] manager: (tapa229ff5f-bd): new Tun device (/org/freedesktop/NetworkManager/Devices/35)
Oct 07 21:55:36 compute-0 ovn_controller[94904]: 2025-10-07T21:55:36Z|00062|binding|INFO|Claiming lport a229ff5f-bd97-4beb-90ef-746057f7bbee for this additional chassis.
Oct 07 21:55:36 compute-0 ovn_controller[94904]: 2025-10-07T21:55:36Z|00063|binding|INFO|a229ff5f-bd97-4beb-90ef-746057f7bbee: Claiming fa:16:3e:28:d9:17 10.100.0.3
Oct 07 21:55:36 compute-0 nova_compute[192716]: 2025-10-07 21:55:36.428 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:55:36 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:55:36.437 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:28:d9:17 10.100.0.3'], port_security=['fa:16:3e:28:d9:17 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '3407f49b-7e6b-4ff7-8ade-5caf647a9bd4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f0bd9c95-1d58-40c0-8d62-097453d85d3e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '42e6cb8a77b54158b2345b916b6fd79b', 'neutron:revision_number': '10', 'neutron:security_group_ids': '0b409cfc-ce5d-4372-a7fd-bd2f8e7211c2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=badb36bd-51e1-4b06-9dec-6b9bc7164000, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[], logical_port=a229ff5f-bd97-4beb-90ef-746057f7bbee) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 21:55:36 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:55:36.437 103791 INFO neutron.agent.ovn.metadata.agent [-] Port a229ff5f-bd97-4beb-90ef-746057f7bbee in datapath f0bd9c95-1d58-40c0-8d62-097453d85d3e unbound from our chassis
Oct 07 21:55:36 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:55:36.440 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f0bd9c95-1d58-40c0-8d62-097453d85d3e
Oct 07 21:55:36 compute-0 nova_compute[192716]: 2025-10-07 21:55:36.443 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:55:36 compute-0 ovn_controller[94904]: 2025-10-07T21:55:36Z|00064|binding|INFO|Setting lport a229ff5f-bd97-4beb-90ef-746057f7bbee ovn-installed in OVS
Oct 07 21:55:36 compute-0 nova_compute[192716]: 2025-10-07 21:55:36.444 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:55:36 compute-0 nova_compute[192716]: 2025-10-07 21:55:36.446 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:55:36 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:55:36.466 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[cceae26e-8c4b-465f-af01-40f6623f8118]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:55:36 compute-0 systemd-machined[152719]: New machine qemu-5-instance-00000006.
Oct 07 21:55:36 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:55:36.507 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[aa6186ac-1059-435d-9636-bb1276d42187]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:55:36 compute-0 systemd[1]: Started Virtual Machine qemu-5-instance-00000006.
Oct 07 21:55:36 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:55:36.512 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[9f41c01d-3b77-4b37-a36d-985466d84295]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:55:36 compute-0 systemd-udevd[217621]: Network interface NamePolicy= disabled on kernel command line.
Oct 07 21:55:36 compute-0 NetworkManager[51722]: <info>  [1759874136.5388] device (tapa229ff5f-bd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 07 21:55:36 compute-0 NetworkManager[51722]: <info>  [1759874136.5406] device (tapa229ff5f-bd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 07 21:55:36 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:55:36.568 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[811aebec-92dd-401e-b6bb-179efff212e0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:55:36 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:55:36.597 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[1013b57c-d854-4728-91bd-b3f787a7b181]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf0bd9c95-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:94:9a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 14, 'tx_packets': 9, 'rx_bytes': 1084, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 14, 'tx_packets': 9, 'rx_bytes': 1084, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 369882, 'reachable_time': 24245, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217631, 'error': None, 'target': 'ovnmeta-f0bd9c95-1d58-40c0-8d62-097453d85d3e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:55:36 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:55:36.621 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[308603b2-5265-479d-b8d7-85fa941dfb43]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf0bd9c95-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 369895, 'tstamp': 369895}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217632, 'error': None, 'target': 'ovnmeta-f0bd9c95-1d58-40c0-8d62-097453d85d3e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf0bd9c95-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 369898, 'tstamp': 369898}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217632, 'error': None, 'target': 'ovnmeta-f0bd9c95-1d58-40c0-8d62-097453d85d3e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:55:36 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:55:36.623 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf0bd9c95-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:55:36 compute-0 nova_compute[192716]: 2025-10-07 21:55:36.628 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:55:36 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:55:36.629 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf0bd9c95-10, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:55:36 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:55:36.629 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 21:55:36 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:55:36.630 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf0bd9c95-10, col_values=(('external_ids', {'iface-id': 'c0a40c81-05dd-4977-aaa2-2a56498aa3a2'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:55:36 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:55:36.630 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 21:55:36 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:55:36.632 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[b51b2ca3-2852-4469-bb8e-d12b1c14da83]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-f0bd9c95-1d58-40c0-8d62-097453d85d3e\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/f0bd9c95-1d58-40c0-8d62-097453d85d3e.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID f0bd9c95-1d58-40c0-8d62-097453d85d3e\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:55:36 compute-0 nova_compute[192716]: 2025-10-07 21:55:36.727 2 WARNING neutronclient.v2_0.client [req-0225c4db-6f6c-42a4-a8a5-b79f29e22819 req-4f93edae-7410-4091-b5ef-5b3a4b88f4d3 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:55:36 compute-0 nova_compute[192716]: 2025-10-07 21:55:36.762 2 DEBUG nova.virt.libvirt.driver [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] Starting finish_migration finish_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12604
Oct 07 21:55:36 compute-0 nova_compute[192716]: 2025-10-07 21:55:36.764 2 DEBUG nova.virt.libvirt.driver [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] Instance directory exists: not creating _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5134
Oct 07 21:55:36 compute-0 nova_compute[192716]: 2025-10-07 21:55:36.765 2 INFO nova.virt.libvirt.driver [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] Creating image(s)
Oct 07 21:55:36 compute-0 nova_compute[192716]: 2025-10-07 21:55:36.766 2 DEBUG nova.objects.instance [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lazy-loading 'trusted_certs' on Instance uuid b581f70a-01a7-4dcb-a224-b1a4b738aab4 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 21:55:37 compute-0 nova_compute[192716]: 2025-10-07 21:55:37.150 2 WARNING neutronclient.v2_0.client [req-0225c4db-6f6c-42a4-a8a5-b79f29e22819 req-4f93edae-7410-4091-b5ef-5b3a4b88f4d3 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:55:37 compute-0 nova_compute[192716]: 2025-10-07 21:55:37.272 2 DEBUG oslo_concurrency.processutils [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:55:37 compute-0 nova_compute[192716]: 2025-10-07 21:55:37.343 2 DEBUG oslo_concurrency.processutils [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:55:37 compute-0 nova_compute[192716]: 2025-10-07 21:55:37.344 2 DEBUG nova.virt.disk.api [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Checking if we can resize image /var/lib/nova/instances/b581f70a-01a7-4dcb-a224-b1a4b738aab4/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 07 21:55:37 compute-0 nova_compute[192716]: 2025-10-07 21:55:37.344 2 DEBUG oslo_concurrency.processutils [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b581f70a-01a7-4dcb-a224-b1a4b738aab4/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:55:37 compute-0 nova_compute[192716]: 2025-10-07 21:55:37.421 2 DEBUG oslo_concurrency.processutils [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b581f70a-01a7-4dcb-a224-b1a4b738aab4/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:55:37 compute-0 nova_compute[192716]: 2025-10-07 21:55:37.421 2 DEBUG nova.virt.disk.api [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Cannot resize image /var/lib/nova/instances/b581f70a-01a7-4dcb-a224-b1a4b738aab4/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 07 21:55:37 compute-0 nova_compute[192716]: 2025-10-07 21:55:37.895 2 DEBUG nova.network.neutron [req-0225c4db-6f6c-42a4-a8a5-b79f29e22819 req-4f93edae-7410-4091-b5ef-5b3a4b88f4d3 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] Updated VIF entry in instance network info cache for port 6ece1b44-4312-4bd7-9ca7-9592ec9faf78. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Oct 07 21:55:37 compute-0 nova_compute[192716]: 2025-10-07 21:55:37.896 2 DEBUG nova.network.neutron [req-0225c4db-6f6c-42a4-a8a5-b79f29e22819 req-4f93edae-7410-4091-b5ef-5b3a4b88f4d3 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] Updating instance_info_cache with network_info: [{"id": "6ece1b44-4312-4bd7-9ca7-9592ec9faf78", "address": "fa:16:3e:de:04:73", "network": {"id": "f0bd9c95-1d58-40c0-8d62-097453d85d3e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-309070824-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97ef6e0949aa4dd8b3ac7e1495d532e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ece1b44-43", "ovs_interfaceid": "6ece1b44-4312-4bd7-9ca7-9592ec9faf78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 21:55:37 compute-0 nova_compute[192716]: 2025-10-07 21:55:37.929 2 DEBUG nova.virt.libvirt.driver [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] Did not create local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5272
Oct 07 21:55:37 compute-0 nova_compute[192716]: 2025-10-07 21:55:37.930 2 DEBUG nova.virt.libvirt.driver [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] Ensure instance console log exists: /var/lib/nova/instances/b581f70a-01a7-4dcb-a224-b1a4b738aab4/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Oct 07 21:55:37 compute-0 nova_compute[192716]: 2025-10-07 21:55:37.931 2 DEBUG oslo_concurrency.lockutils [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:55:37 compute-0 nova_compute[192716]: 2025-10-07 21:55:37.932 2 DEBUG oslo_concurrency.lockutils [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:55:37 compute-0 nova_compute[192716]: 2025-10-07 21:55:37.932 2 DEBUG oslo_concurrency.lockutils [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:55:37 compute-0 nova_compute[192716]: 2025-10-07 21:55:37.938 2 DEBUG nova.virt.libvirt.driver [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] Start _get_guest_xml network_info=[{"id": "6ece1b44-4312-4bd7-9ca7-9592ec9faf78", "address": "fa:16:3e:de:04:73", "network": {"id": "f0bd9c95-1d58-40c0-8d62-097453d85d3e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-309070824-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-309070824-network", "vif_mac": "fa:16:3e:de:04:73"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97ef6e0949aa4dd8b3ac7e1495d532e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ece1b44-43", "ovs_interfaceid": "6ece1b44-4312-4bd7-9ca7-9592ec9faf78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T21:45:40Z,direct_url=<?>,disk_format='qcow2',id=c40cab67-7e52-4762-b275-de0efa24bdf4,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='293ff4341f3d48a4ae100bf4fc7b99bd',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T21:45:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'boot_index': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'image_id': 'c40cab67-7e52-4762-b275-de0efa24bdf4'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Oct 07 21:55:37 compute-0 nova_compute[192716]: 2025-10-07 21:55:37.944 2 WARNING nova.virt.libvirt.driver [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 21:55:37 compute-0 nova_compute[192716]: 2025-10-07 21:55:37.948 2 DEBUG nova.virt.driver [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='c40cab67-7e52-4762-b275-de0efa24bdf4', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteActionsViaActuator-server-210535964', uuid='b581f70a-01a7-4dcb-a224-b1a4b738aab4'), owner=OwnerMeta(userid='b71b837a81994b9694ede764e0406ac8', username='tempest-TestExecuteActionsViaActuator-1409880739-project-admin', projectid='42e6cb8a77b54158b2345b916b6fd79b', projectname='tempest-TestExecuteActionsViaActuator-1409880739'), image=ImageMeta(id='c40cab67-7e52-4762-b275-de0efa24bdf4', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_cdrom_bus': 'sata', 'hw_disk_bus': 'virtio', 'hw_input_bus': 'usb', 'hw_machine_type': 'q35', 'hw_pointer_model': 'usbtablet', 'hw_rng_model': 'virtio', 'hw_video_model': 'virtio', 'hw_vif_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='e6ccb6f6-2ec4-4305-9fdd-4d931e83ed21', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "6ece1b44-4312-4bd7-9ca7-9592ec9faf78", "address": "fa:16:3e:de:04:73", "network": {"id": "f0bd9c95-1d58-40c0-8d62-097453d85d3e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-309070824-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-309070824-network", "vif_mac": "fa:16:3e:de:04:73"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97ef6e0949aa4dd8b3ac7e1495d532e1", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ece1b44-43", "ovs_interfaceid": "6ece1b44-4312-4bd7-9ca7-9592ec9faf78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251007122402.7278e66.el10', creation_time=1759874137.9480577) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Oct 07 21:55:37 compute-0 nova_compute[192716]: 2025-10-07 21:55:37.955 2 DEBUG nova.virt.libvirt.host [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Oct 07 21:55:37 compute-0 nova_compute[192716]: 2025-10-07 21:55:37.957 2 DEBUG nova.virt.libvirt.host [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Oct 07 21:55:37 compute-0 nova_compute[192716]: 2025-10-07 21:55:37.961 2 DEBUG nova.virt.libvirt.host [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Oct 07 21:55:37 compute-0 nova_compute[192716]: 2025-10-07 21:55:37.962 2 DEBUG nova.virt.libvirt.host [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Oct 07 21:55:37 compute-0 nova_compute[192716]: 2025-10-07 21:55:37.963 2 DEBUG nova.virt.libvirt.driver [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Oct 07 21:55:37 compute-0 nova_compute[192716]: 2025-10-07 21:55:37.964 2 DEBUG nova.virt.hardware [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-07T21:45:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='e6ccb6f6-2ec4-4305-9fdd-4d931e83ed21',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T21:45:40Z,direct_url=<?>,disk_format='qcow2',id=c40cab67-7e52-4762-b275-de0efa24bdf4,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='293ff4341f3d48a4ae100bf4fc7b99bd',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T21:45:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Oct 07 21:55:37 compute-0 nova_compute[192716]: 2025-10-07 21:55:37.965 2 DEBUG nova.virt.hardware [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Oct 07 21:55:37 compute-0 nova_compute[192716]: 2025-10-07 21:55:37.965 2 DEBUG nova.virt.hardware [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Oct 07 21:55:37 compute-0 nova_compute[192716]: 2025-10-07 21:55:37.966 2 DEBUG nova.virt.hardware [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Oct 07 21:55:37 compute-0 nova_compute[192716]: 2025-10-07 21:55:37.967 2 DEBUG nova.virt.hardware [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Oct 07 21:55:37 compute-0 nova_compute[192716]: 2025-10-07 21:55:37.967 2 DEBUG nova.virt.hardware [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Oct 07 21:55:37 compute-0 nova_compute[192716]: 2025-10-07 21:55:37.968 2 DEBUG nova.virt.hardware [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Oct 07 21:55:37 compute-0 nova_compute[192716]: 2025-10-07 21:55:37.968 2 DEBUG nova.virt.hardware [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Oct 07 21:55:37 compute-0 nova_compute[192716]: 2025-10-07 21:55:37.969 2 DEBUG nova.virt.hardware [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Oct 07 21:55:37 compute-0 nova_compute[192716]: 2025-10-07 21:55:37.969 2 DEBUG nova.virt.hardware [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Oct 07 21:55:37 compute-0 nova_compute[192716]: 2025-10-07 21:55:37.970 2 DEBUG nova.virt.hardware [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Oct 07 21:55:37 compute-0 nova_compute[192716]: 2025-10-07 21:55:37.970 2 DEBUG nova.objects.instance [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lazy-loading 'vcpu_model' on Instance uuid b581f70a-01a7-4dcb-a224-b1a4b738aab4 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 21:55:38 compute-0 sshd-session[217584]: Connection closed by invalid user admin 116.110.151.5 port 33910 [preauth]
Oct 07 21:55:38 compute-0 nova_compute[192716]: 2025-10-07 21:55:38.404 2 DEBUG oslo_concurrency.lockutils [req-0225c4db-6f6c-42a4-a8a5-b79f29e22819 req-4f93edae-7410-4091-b5ef-5b3a4b88f4d3 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Releasing lock "refresh_cache-b581f70a-01a7-4dcb-a224-b1a4b738aab4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 07 21:55:38 compute-0 nova_compute[192716]: 2025-10-07 21:55:38.477 2 DEBUG nova.objects.base [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Object Instance<b581f70a-01a7-4dcb-a224-b1a4b738aab4> lazy-loaded attributes: trusted_certs,vcpu_model wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Oct 07 21:55:38 compute-0 nova_compute[192716]: 2025-10-07 21:55:38.483 2 DEBUG oslo_concurrency.processutils [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b581f70a-01a7-4dcb-a224-b1a4b738aab4/disk.config --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:55:38 compute-0 nova_compute[192716]: 2025-10-07 21:55:38.571 2 DEBUG oslo_concurrency.processutils [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b581f70a-01a7-4dcb-a224-b1a4b738aab4/disk.config --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:55:38 compute-0 nova_compute[192716]: 2025-10-07 21:55:38.572 2 DEBUG oslo_concurrency.lockutils [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "/var/lib/nova/instances/b581f70a-01a7-4dcb-a224-b1a4b738aab4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:55:38 compute-0 nova_compute[192716]: 2025-10-07 21:55:38.572 2 DEBUG oslo_concurrency.lockutils [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "/var/lib/nova/instances/b581f70a-01a7-4dcb-a224-b1a4b738aab4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:55:38 compute-0 nova_compute[192716]: 2025-10-07 21:55:38.573 2 DEBUG oslo_concurrency.lockutils [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "/var/lib/nova/instances/b581f70a-01a7-4dcb-a224-b1a4b738aab4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:55:38 compute-0 nova_compute[192716]: 2025-10-07 21:55:38.574 2 DEBUG nova.virt.libvirt.vif [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-10-07T21:54:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-210535964',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-210535964',id=8,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-07T21:54:37Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='42e6cb8a77b54158b2345b916b6fd79b',ramdisk_id='',reservation_id='r-h0vzfr0p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model
='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-1409880739',owner_user_name='tempest-TestExecuteActionsViaActuator-1409880739-project-admin'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T21:55:30Z,user_data=None,user_id='b71b837a81994b9694ede764e0406ac8',uuid=b581f70a-01a7-4dcb-a224-b1a4b738aab4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6ece1b44-4312-4bd7-9ca7-9592ec9faf78", "address": "fa:16:3e:de:04:73", "network": {"id": "f0bd9c95-1d58-40c0-8d62-097453d85d3e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-309070824-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-309070824-network", "vif_mac": "fa:16:3e:de:04:73"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97ef6e0949aa4dd8b3ac7e1495d532e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ece1b44-43", "ovs_interfaceid": "6ece1b44-4312-4bd7-9ca7-9592ec9faf78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Oct 07 21:55:38 compute-0 nova_compute[192716]: 2025-10-07 21:55:38.574 2 DEBUG nova.network.os_vif_util [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Converting VIF {"id": "6ece1b44-4312-4bd7-9ca7-9592ec9faf78", "address": "fa:16:3e:de:04:73", "network": {"id": "f0bd9c95-1d58-40c0-8d62-097453d85d3e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-309070824-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-309070824-network", "vif_mac": "fa:16:3e:de:04:73"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97ef6e0949aa4dd8b3ac7e1495d532e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ece1b44-43", "ovs_interfaceid": "6ece1b44-4312-4bd7-9ca7-9592ec9faf78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 21:55:38 compute-0 nova_compute[192716]: 2025-10-07 21:55:38.575 2 DEBUG nova.network.os_vif_util [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:de:04:73,bridge_name='br-int',has_traffic_filtering=True,id=6ece1b44-4312-4bd7-9ca7-9592ec9faf78,network=Network(f0bd9c95-1d58-40c0-8d62-097453d85d3e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6ece1b44-43') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 21:55:38 compute-0 nova_compute[192716]: 2025-10-07 21:55:38.577 2 DEBUG nova.virt.libvirt.driver [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] End _get_guest_xml xml=<domain type="kvm">
Oct 07 21:55:38 compute-0 nova_compute[192716]:   <uuid>b581f70a-01a7-4dcb-a224-b1a4b738aab4</uuid>
Oct 07 21:55:38 compute-0 nova_compute[192716]:   <name>instance-00000008</name>
Oct 07 21:55:38 compute-0 nova_compute[192716]:   <memory>131072</memory>
Oct 07 21:55:38 compute-0 nova_compute[192716]:   <vcpu>1</vcpu>
Oct 07 21:55:38 compute-0 nova_compute[192716]:   <metadata>
Oct 07 21:55:38 compute-0 nova_compute[192716]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 07 21:55:38 compute-0 nova_compute[192716]:       <nova:package version="32.1.0-0.20251007122402.7278e66.el10"/>
Oct 07 21:55:38 compute-0 nova_compute[192716]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-210535964</nova:name>
Oct 07 21:55:38 compute-0 nova_compute[192716]:       <nova:creationTime>2025-10-07 21:55:37</nova:creationTime>
Oct 07 21:55:38 compute-0 nova_compute[192716]:       <nova:flavor name="m1.nano" id="e6ccb6f6-2ec4-4305-9fdd-4d931e83ed21">
Oct 07 21:55:38 compute-0 nova_compute[192716]:         <nova:memory>128</nova:memory>
Oct 07 21:55:38 compute-0 nova_compute[192716]:         <nova:disk>1</nova:disk>
Oct 07 21:55:38 compute-0 nova_compute[192716]:         <nova:swap>0</nova:swap>
Oct 07 21:55:38 compute-0 nova_compute[192716]:         <nova:ephemeral>0</nova:ephemeral>
Oct 07 21:55:38 compute-0 nova_compute[192716]:         <nova:vcpus>1</nova:vcpus>
Oct 07 21:55:38 compute-0 nova_compute[192716]:         <nova:extraSpecs>
Oct 07 21:55:38 compute-0 nova_compute[192716]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 07 21:55:38 compute-0 nova_compute[192716]:         </nova:extraSpecs>
Oct 07 21:55:38 compute-0 nova_compute[192716]:       </nova:flavor>
Oct 07 21:55:38 compute-0 nova_compute[192716]:       <nova:image uuid="c40cab67-7e52-4762-b275-de0efa24bdf4">
Oct 07 21:55:38 compute-0 nova_compute[192716]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 07 21:55:38 compute-0 nova_compute[192716]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 07 21:55:38 compute-0 nova_compute[192716]:         <nova:minDisk>1</nova:minDisk>
Oct 07 21:55:38 compute-0 nova_compute[192716]:         <nova:minRam>0</nova:minRam>
Oct 07 21:55:38 compute-0 nova_compute[192716]:         <nova:properties>
Oct 07 21:55:38 compute-0 nova_compute[192716]:           <nova:property name="hw_cdrom_bus">sata</nova:property>
Oct 07 21:55:38 compute-0 nova_compute[192716]:           <nova:property name="hw_disk_bus">virtio</nova:property>
Oct 07 21:55:38 compute-0 nova_compute[192716]:           <nova:property name="hw_input_bus">usb</nova:property>
Oct 07 21:55:38 compute-0 nova_compute[192716]:           <nova:property name="hw_machine_type">q35</nova:property>
Oct 07 21:55:38 compute-0 nova_compute[192716]:           <nova:property name="hw_pointer_model">usbtablet</nova:property>
Oct 07 21:55:38 compute-0 nova_compute[192716]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 07 21:55:38 compute-0 nova_compute[192716]:           <nova:property name="hw_video_model">virtio</nova:property>
Oct 07 21:55:38 compute-0 nova_compute[192716]:           <nova:property name="hw_vif_model">virtio</nova:property>
Oct 07 21:55:38 compute-0 nova_compute[192716]:         </nova:properties>
Oct 07 21:55:38 compute-0 nova_compute[192716]:       </nova:image>
Oct 07 21:55:38 compute-0 nova_compute[192716]:       <nova:owner>
Oct 07 21:55:38 compute-0 nova_compute[192716]:         <nova:user uuid="b71b837a81994b9694ede764e0406ac8">tempest-TestExecuteActionsViaActuator-1409880739-project-admin</nova:user>
Oct 07 21:55:38 compute-0 nova_compute[192716]:         <nova:project uuid="42e6cb8a77b54158b2345b916b6fd79b">tempest-TestExecuteActionsViaActuator-1409880739</nova:project>
Oct 07 21:55:38 compute-0 nova_compute[192716]:       </nova:owner>
Oct 07 21:55:38 compute-0 nova_compute[192716]:       <nova:root type="image" uuid="c40cab67-7e52-4762-b275-de0efa24bdf4"/>
Oct 07 21:55:38 compute-0 nova_compute[192716]:       <nova:ports>
Oct 07 21:55:38 compute-0 nova_compute[192716]:         <nova:port uuid="6ece1b44-4312-4bd7-9ca7-9592ec9faf78">
Oct 07 21:55:38 compute-0 nova_compute[192716]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Oct 07 21:55:38 compute-0 nova_compute[192716]:         </nova:port>
Oct 07 21:55:38 compute-0 nova_compute[192716]:       </nova:ports>
Oct 07 21:55:38 compute-0 nova_compute[192716]:     </nova:instance>
Oct 07 21:55:38 compute-0 nova_compute[192716]:   </metadata>
Oct 07 21:55:38 compute-0 nova_compute[192716]:   <sysinfo type="smbios">
Oct 07 21:55:38 compute-0 nova_compute[192716]:     <system>
Oct 07 21:55:38 compute-0 nova_compute[192716]:       <entry name="manufacturer">RDO</entry>
Oct 07 21:55:38 compute-0 nova_compute[192716]:       <entry name="product">OpenStack Compute</entry>
Oct 07 21:55:38 compute-0 nova_compute[192716]:       <entry name="version">32.1.0-0.20251007122402.7278e66.el10</entry>
Oct 07 21:55:38 compute-0 nova_compute[192716]:       <entry name="serial">b581f70a-01a7-4dcb-a224-b1a4b738aab4</entry>
Oct 07 21:55:38 compute-0 nova_compute[192716]:       <entry name="uuid">b581f70a-01a7-4dcb-a224-b1a4b738aab4</entry>
Oct 07 21:55:38 compute-0 nova_compute[192716]:       <entry name="family">Virtual Machine</entry>
Oct 07 21:55:38 compute-0 nova_compute[192716]:     </system>
Oct 07 21:55:38 compute-0 nova_compute[192716]:   </sysinfo>
Oct 07 21:55:38 compute-0 nova_compute[192716]:   <os>
Oct 07 21:55:38 compute-0 nova_compute[192716]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 07 21:55:38 compute-0 nova_compute[192716]:     <boot dev="hd"/>
Oct 07 21:55:38 compute-0 nova_compute[192716]:     <smbios mode="sysinfo"/>
Oct 07 21:55:38 compute-0 nova_compute[192716]:   </os>
Oct 07 21:55:38 compute-0 nova_compute[192716]:   <features>
Oct 07 21:55:38 compute-0 nova_compute[192716]:     <acpi/>
Oct 07 21:55:38 compute-0 nova_compute[192716]:     <apic/>
Oct 07 21:55:38 compute-0 nova_compute[192716]:     <vmcoreinfo/>
Oct 07 21:55:38 compute-0 nova_compute[192716]:   </features>
Oct 07 21:55:38 compute-0 nova_compute[192716]:   <clock offset="utc">
Oct 07 21:55:38 compute-0 nova_compute[192716]:     <timer name="pit" tickpolicy="delay"/>
Oct 07 21:55:38 compute-0 nova_compute[192716]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 07 21:55:38 compute-0 nova_compute[192716]:     <timer name="hpet" present="no"/>
Oct 07 21:55:38 compute-0 nova_compute[192716]:   </clock>
Oct 07 21:55:38 compute-0 nova_compute[192716]:   <cpu mode="host-model" match="exact">
Oct 07 21:55:38 compute-0 nova_compute[192716]:     <topology sockets="1" cores="1" threads="1"/>
Oct 07 21:55:38 compute-0 nova_compute[192716]:   </cpu>
Oct 07 21:55:38 compute-0 nova_compute[192716]:   <devices>
Oct 07 21:55:38 compute-0 nova_compute[192716]:     <disk type="file" device="disk">
Oct 07 21:55:38 compute-0 nova_compute[192716]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 07 21:55:38 compute-0 nova_compute[192716]:       <source file="/var/lib/nova/instances/b581f70a-01a7-4dcb-a224-b1a4b738aab4/disk"/>
Oct 07 21:55:38 compute-0 nova_compute[192716]:       <target dev="vda" bus="virtio"/>
Oct 07 21:55:38 compute-0 nova_compute[192716]:     </disk>
Oct 07 21:55:38 compute-0 nova_compute[192716]:     <disk type="file" device="cdrom">
Oct 07 21:55:38 compute-0 nova_compute[192716]:       <driver name="qemu" type="raw" cache="none"/>
Oct 07 21:55:38 compute-0 nova_compute[192716]:       <source file="/var/lib/nova/instances/b581f70a-01a7-4dcb-a224-b1a4b738aab4/disk.config"/>
Oct 07 21:55:38 compute-0 nova_compute[192716]:       <target dev="sda" bus="sata"/>
Oct 07 21:55:38 compute-0 nova_compute[192716]:     </disk>
Oct 07 21:55:38 compute-0 nova_compute[192716]:     <interface type="ethernet">
Oct 07 21:55:38 compute-0 nova_compute[192716]:       <mac address="fa:16:3e:de:04:73"/>
Oct 07 21:55:38 compute-0 nova_compute[192716]:       <model type="virtio"/>
Oct 07 21:55:38 compute-0 nova_compute[192716]:       <driver name="vhost" rx_queue_size="512"/>
Oct 07 21:55:38 compute-0 nova_compute[192716]:       <mtu size="1442"/>
Oct 07 21:55:38 compute-0 nova_compute[192716]:       <target dev="tap6ece1b44-43"/>
Oct 07 21:55:38 compute-0 nova_compute[192716]:     </interface>
Oct 07 21:55:38 compute-0 nova_compute[192716]:     <serial type="pty">
Oct 07 21:55:38 compute-0 nova_compute[192716]:       <log file="/var/lib/nova/instances/b581f70a-01a7-4dcb-a224-b1a4b738aab4/console.log" append="off"/>
Oct 07 21:55:38 compute-0 nova_compute[192716]:     </serial>
Oct 07 21:55:38 compute-0 nova_compute[192716]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 07 21:55:38 compute-0 nova_compute[192716]:     <video>
Oct 07 21:55:38 compute-0 nova_compute[192716]:       <model type="virtio"/>
Oct 07 21:55:38 compute-0 nova_compute[192716]:     </video>
Oct 07 21:55:38 compute-0 nova_compute[192716]:     <input type="tablet" bus="usb"/>
Oct 07 21:55:38 compute-0 nova_compute[192716]:     <rng model="virtio">
Oct 07 21:55:38 compute-0 nova_compute[192716]:       <backend model="random">/dev/urandom</backend>
Oct 07 21:55:38 compute-0 nova_compute[192716]:     </rng>
Oct 07 21:55:38 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root"/>
Oct 07 21:55:38 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:55:38 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:55:38 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:55:38 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:55:38 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:55:38 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:55:38 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:55:38 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:55:38 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:55:38 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:55:38 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:55:38 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:55:38 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:55:38 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:55:38 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:55:38 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:55:38 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:55:38 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:55:38 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:55:38 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:55:38 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:55:38 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:55:38 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:55:38 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:55:38 compute-0 nova_compute[192716]:     <controller type="usb" index="0"/>
Oct 07 21:55:38 compute-0 nova_compute[192716]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 07 21:55:38 compute-0 nova_compute[192716]:       <stats period="10"/>
Oct 07 21:55:38 compute-0 nova_compute[192716]:     </memballoon>
Oct 07 21:55:38 compute-0 nova_compute[192716]:   </devices>
Oct 07 21:55:38 compute-0 nova_compute[192716]: </domain>
Oct 07 21:55:38 compute-0 nova_compute[192716]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Oct 07 21:55:38 compute-0 nova_compute[192716]: 2025-10-07 21:55:38.583 2 DEBUG nova.virt.libvirt.vif [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-10-07T21:54:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-210535964',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-210535964',id=8,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-07T21:54:37Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='42e6cb8a77b54158b2345b916b6fd79b',ramdisk_id='',reservation_id='r-h0vzfr0p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-1409880739',owner_user_name='tempest-TestExecuteActionsViaActuator-1409880739-project-admin'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T21:55:30Z,user_data=None,user_id='b71b837a81994b9694ede764e0406ac8',uuid=b581f70a-01a7-4dcb-a224-b1a4b738aab4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6ece1b44-4312-4bd7-9ca7-9592ec9faf78", "address": "fa:16:3e:de:04:73", "network": {"id": "f0bd9c95-1d58-40c0-8d62-097453d85d3e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-309070824-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-309070824-network", "vif_mac": "fa:16:3e:de:04:73"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97ef6e0949aa4dd8b3ac7e1495d532e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ece1b44-43", "ovs_interfaceid": "6ece1b44-4312-4bd7-9ca7-9592ec9faf78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 07 21:55:38 compute-0 nova_compute[192716]: 2025-10-07 21:55:38.584 2 DEBUG nova.network.os_vif_util [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Converting VIF {"id": "6ece1b44-4312-4bd7-9ca7-9592ec9faf78", "address": "fa:16:3e:de:04:73", "network": {"id": "f0bd9c95-1d58-40c0-8d62-097453d85d3e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-309070824-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-309070824-network", "vif_mac": "fa:16:3e:de:04:73"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97ef6e0949aa4dd8b3ac7e1495d532e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ece1b44-43", "ovs_interfaceid": "6ece1b44-4312-4bd7-9ca7-9592ec9faf78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 21:55:38 compute-0 nova_compute[192716]: 2025-10-07 21:55:38.585 2 DEBUG nova.network.os_vif_util [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:de:04:73,bridge_name='br-int',has_traffic_filtering=True,id=6ece1b44-4312-4bd7-9ca7-9592ec9faf78,network=Network(f0bd9c95-1d58-40c0-8d62-097453d85d3e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6ece1b44-43') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 21:55:38 compute-0 nova_compute[192716]: 2025-10-07 21:55:38.585 2 DEBUG os_vif [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:de:04:73,bridge_name='br-int',has_traffic_filtering=True,id=6ece1b44-4312-4bd7-9ca7-9592ec9faf78,network=Network(f0bd9c95-1d58-40c0-8d62-097453d85d3e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6ece1b44-43') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 07 21:55:38 compute-0 nova_compute[192716]: 2025-10-07 21:55:38.587 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:55:38 compute-0 nova_compute[192716]: 2025-10-07 21:55:38.588 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:55:38 compute-0 nova_compute[192716]: 2025-10-07 21:55:38.588 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 21:55:38 compute-0 nova_compute[192716]: 2025-10-07 21:55:38.590 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:55:38 compute-0 nova_compute[192716]: 2025-10-07 21:55:38.591 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '3c7b8d18-dcf9-559e-962a-5729b22ee372', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:55:38 compute-0 nova_compute[192716]: 2025-10-07 21:55:38.625 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:55:38 compute-0 nova_compute[192716]: 2025-10-07 21:55:38.630 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 07 21:55:38 compute-0 nova_compute[192716]: 2025-10-07 21:55:38.634 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:55:38 compute-0 nova_compute[192716]: 2025-10-07 21:55:38.635 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6ece1b44-43, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:55:38 compute-0 nova_compute[192716]: 2025-10-07 21:55:38.636 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap6ece1b44-43, col_values=(('qos', UUID('24a7b4d1-6591-44fd-aaac-32143bea268b')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:55:38 compute-0 nova_compute[192716]: 2025-10-07 21:55:38.636 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap6ece1b44-43, col_values=(('external_ids', {'iface-id': '6ece1b44-4312-4bd7-9ca7-9592ec9faf78', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:de:04:73', 'vm-uuid': 'b581f70a-01a7-4dcb-a224-b1a4b738aab4'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:55:38 compute-0 nova_compute[192716]: 2025-10-07 21:55:38.639 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:55:38 compute-0 NetworkManager[51722]: <info>  [1759874138.6428] manager: (tap6ece1b44-43): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/36)
Oct 07 21:55:38 compute-0 nova_compute[192716]: 2025-10-07 21:55:38.642 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 07 21:55:38 compute-0 nova_compute[192716]: 2025-10-07 21:55:38.653 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:55:38 compute-0 nova_compute[192716]: 2025-10-07 21:55:38.654 2 INFO os_vif [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:de:04:73,bridge_name='br-int',has_traffic_filtering=True,id=6ece1b44-4312-4bd7-9ca7-9592ec9faf78,network=Network(f0bd9c95-1d58-40c0-8d62-097453d85d3e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6ece1b44-43')
Oct 07 21:55:39 compute-0 nova_compute[192716]: 2025-10-07 21:55:39.286 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:55:39 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Oct 07 21:55:39 compute-0 systemd[217461]: Activating special unit Exit the Session...
Oct 07 21:55:39 compute-0 systemd[217461]: Stopped target Main User Target.
Oct 07 21:55:39 compute-0 systemd[217461]: Stopped target Basic System.
Oct 07 21:55:39 compute-0 systemd[217461]: Stopped target Paths.
Oct 07 21:55:39 compute-0 systemd[217461]: Stopped target Sockets.
Oct 07 21:55:39 compute-0 systemd[217461]: Stopped target Timers.
Oct 07 21:55:39 compute-0 systemd[217461]: Stopped Mark boot as successful after the user session has run 2 minutes.
Oct 07 21:55:39 compute-0 systemd[217461]: Stopped Daily Cleanup of User's Temporary Directories.
Oct 07 21:55:39 compute-0 systemd[217461]: Closed D-Bus User Message Bus Socket.
Oct 07 21:55:39 compute-0 systemd[217461]: Stopped Create User's Volatile Files and Directories.
Oct 07 21:55:39 compute-0 systemd[217461]: Removed slice User Application Slice.
Oct 07 21:55:39 compute-0 systemd[217461]: Reached target Shutdown.
Oct 07 21:55:39 compute-0 systemd[217461]: Finished Exit the Session.
Oct 07 21:55:39 compute-0 systemd[217461]: Reached target Exit the Session.
Oct 07 21:55:39 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Oct 07 21:55:39 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Oct 07 21:55:39 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Oct 07 21:55:39 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Oct 07 21:55:39 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Oct 07 21:55:39 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Oct 07 21:55:39 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Oct 07 21:55:40 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:55:40.071 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:e1:cc', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '62:84:ba:a4:1b:31'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 21:55:40 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:55:40.072 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 07 21:55:40 compute-0 nova_compute[192716]: 2025-10-07 21:55:40.109 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:55:40 compute-0 ovn_controller[94904]: 2025-10-07T21:55:40Z|00065|binding|INFO|Claiming lport a229ff5f-bd97-4beb-90ef-746057f7bbee for this chassis.
Oct 07 21:55:40 compute-0 ovn_controller[94904]: 2025-10-07T21:55:40Z|00066|binding|INFO|a229ff5f-bd97-4beb-90ef-746057f7bbee: Claiming fa:16:3e:28:d9:17 10.100.0.3
Oct 07 21:55:40 compute-0 ovn_controller[94904]: 2025-10-07T21:55:40Z|00067|binding|INFO|Setting lport a229ff5f-bd97-4beb-90ef-746057f7bbee up in Southbound
Oct 07 21:55:40 compute-0 nova_compute[192716]: 2025-10-07 21:55:40.197 2 DEBUG nova.virt.libvirt.driver [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 07 21:55:40 compute-0 nova_compute[192716]: 2025-10-07 21:55:40.198 2 DEBUG nova.virt.libvirt.driver [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 07 21:55:40 compute-0 nova_compute[192716]: 2025-10-07 21:55:40.198 2 DEBUG nova.virt.libvirt.driver [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] No VIF found with MAC fa:16:3e:de:04:73, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Oct 07 21:55:40 compute-0 nova_compute[192716]: 2025-10-07 21:55:40.199 2 INFO nova.virt.libvirt.driver [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] Using config drive
Oct 07 21:55:40 compute-0 NetworkManager[51722]: <info>  [1759874140.2761] manager: (tap6ece1b44-43): new Tun device (/org/freedesktop/NetworkManager/Devices/37)
Oct 07 21:55:40 compute-0 kernel: tap6ece1b44-43: entered promiscuous mode
Oct 07 21:55:40 compute-0 nova_compute[192716]: 2025-10-07 21:55:40.281 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:55:40 compute-0 ovn_controller[94904]: 2025-10-07T21:55:40Z|00068|binding|INFO|Claiming lport 6ece1b44-4312-4bd7-9ca7-9592ec9faf78 for this chassis.
Oct 07 21:55:40 compute-0 ovn_controller[94904]: 2025-10-07T21:55:40Z|00069|binding|INFO|6ece1b44-4312-4bd7-9ca7-9592ec9faf78: Claiming fa:16:3e:de:04:73 10.100.0.11
Oct 07 21:55:40 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:55:40.290 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:de:04:73 10.100.0.11'], port_security=['fa:16:3e:de:04:73 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'b581f70a-01a7-4dcb-a224-b1a4b738aab4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f0bd9c95-1d58-40c0-8d62-097453d85d3e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '42e6cb8a77b54158b2345b916b6fd79b', 'neutron:revision_number': '9', 'neutron:security_group_ids': '0b409cfc-ce5d-4372-a7fd-bd2f8e7211c2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=badb36bd-51e1-4b06-9dec-6b9bc7164000, chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], logical_port=6ece1b44-4312-4bd7-9ca7-9592ec9faf78) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 21:55:40 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:55:40.292 103791 INFO neutron.agent.ovn.metadata.agent [-] Port 6ece1b44-4312-4bd7-9ca7-9592ec9faf78 in datapath f0bd9c95-1d58-40c0-8d62-097453d85d3e bound to our chassis
Oct 07 21:55:40 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:55:40.294 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f0bd9c95-1d58-40c0-8d62-097453d85d3e
Oct 07 21:55:40 compute-0 ovn_controller[94904]: 2025-10-07T21:55:40Z|00070|binding|INFO|Setting lport 6ece1b44-4312-4bd7-9ca7-9592ec9faf78 ovn-installed in OVS
Oct 07 21:55:40 compute-0 ovn_controller[94904]: 2025-10-07T21:55:40Z|00071|binding|INFO|Setting lport 6ece1b44-4312-4bd7-9ca7-9592ec9faf78 up in Southbound
Oct 07 21:55:40 compute-0 nova_compute[192716]: 2025-10-07 21:55:40.318 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:55:40 compute-0 nova_compute[192716]: 2025-10-07 21:55:40.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:55:40 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:55:40.326 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[bad84cd8-53e4-4273-a632-7b9933e9f3ce]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:55:40 compute-0 systemd-udevd[217683]: Network interface NamePolicy= disabled on kernel command line.
Oct 07 21:55:40 compute-0 systemd-machined[152719]: New machine qemu-6-instance-00000008.
Oct 07 21:55:40 compute-0 NetworkManager[51722]: <info>  [1759874140.3644] device (tap6ece1b44-43): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 07 21:55:40 compute-0 NetworkManager[51722]: <info>  [1759874140.3658] device (tap6ece1b44-43): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 07 21:55:40 compute-0 systemd[1]: Started Virtual Machine qemu-6-instance-00000008.
Oct 07 21:55:40 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:55:40.378 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[95258a63-6ae5-4fe0-891c-ce234c0b0b8d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:55:40 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:55:40.382 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[87eaa254-17e3-4494-947e-930a45ea43fa]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:55:40 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:55:40.431 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[d1354c7e-96c4-4e2e-bc39-ff10c09e9c65]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:55:40 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:55:40.463 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[faa1a8c6-d738-4d25-bf94-9ee23c362206]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf0bd9c95-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:94:9a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 19, 'tx_packets': 11, 'rx_bytes': 1294, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 19, 'tx_packets': 11, 'rx_bytes': 1294, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 369882, 'reachable_time': 24245, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217692, 'error': None, 'target': 'ovnmeta-f0bd9c95-1d58-40c0-8d62-097453d85d3e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:55:40 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:55:40.488 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[1724b8b9-c001-4f68-af8b-70a8ebe5be84]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf0bd9c95-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 369895, 'tstamp': 369895}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217696, 'error': None, 'target': 'ovnmeta-f0bd9c95-1d58-40c0-8d62-097453d85d3e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf0bd9c95-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 369898, 'tstamp': 369898}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217696, 'error': None, 'target': 'ovnmeta-f0bd9c95-1d58-40c0-8d62-097453d85d3e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:55:40 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:55:40.490 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf0bd9c95-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:55:40 compute-0 nova_compute[192716]: 2025-10-07 21:55:40.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:55:40 compute-0 nova_compute[192716]: 2025-10-07 21:55:40.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:55:40 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:55:40.495 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf0bd9c95-10, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:55:40 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:55:40.495 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 21:55:40 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:55:40.496 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf0bd9c95-10, col_values=(('external_ids', {'iface-id': 'c0a40c81-05dd-4977-aaa2-2a56498aa3a2'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:55:40 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:55:40.496 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 21:55:40 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:55:40.498 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[3f49a81a-4b8c-44c5-aa0e-adbf670dce76]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-f0bd9c95-1d58-40c0-8d62-097453d85d3e\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/f0bd9c95-1d58-40c0-8d62-097453d85d3e.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID f0bd9c95-1d58-40c0-8d62-097453d85d3e\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:55:41 compute-0 nova_compute[192716]: 2025-10-07 21:55:41.265 2 INFO nova.compute.manager [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 3407f49b-7e6b-4ff7-8ade-5caf647a9bd4] Post operation of migration started
Oct 07 21:55:41 compute-0 nova_compute[192716]: 2025-10-07 21:55:41.266 2 WARNING neutronclient.v2_0.client [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:55:41 compute-0 nova_compute[192716]: 2025-10-07 21:55:41.391 2 WARNING neutronclient.v2_0.client [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:55:41 compute-0 nova_compute[192716]: 2025-10-07 21:55:41.393 2 WARNING neutronclient.v2_0.client [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:55:41 compute-0 nova_compute[192716]: 2025-10-07 21:55:41.411 2 DEBUG nova.compute.manager [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Oct 07 21:55:41 compute-0 nova_compute[192716]: 2025-10-07 21:55:41.415 2 INFO nova.virt.libvirt.driver [-] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] Instance running successfully.
Oct 07 21:55:41 compute-0 virtqemud[192532]: argument unsupported: QEMU guest agent is not configured
Oct 07 21:55:41 compute-0 nova_compute[192716]: 2025-10-07 21:55:41.419 2 DEBUG nova.virt.libvirt.guest [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:200
Oct 07 21:55:41 compute-0 nova_compute[192716]: 2025-10-07 21:55:41.420 2 DEBUG nova.virt.libvirt.driver [None req-ed6cb4a5-731d-42ef-b453-e918f30c1b2f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] finish_migration finished successfully. finish_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12699
Oct 07 21:55:41 compute-0 nova_compute[192716]: 2025-10-07 21:55:41.480 2 DEBUG oslo_concurrency.lockutils [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "refresh_cache-3407f49b-7e6b-4ff7-8ade-5caf647a9bd4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 07 21:55:41 compute-0 nova_compute[192716]: 2025-10-07 21:55:41.480 2 DEBUG oslo_concurrency.lockutils [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquired lock "refresh_cache-3407f49b-7e6b-4ff7-8ade-5caf647a9bd4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 07 21:55:41 compute-0 nova_compute[192716]: 2025-10-07 21:55:41.480 2 DEBUG nova.network.neutron [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 3407f49b-7e6b-4ff7-8ade-5caf647a9bd4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 07 21:55:41 compute-0 nova_compute[192716]: 2025-10-07 21:55:41.986 2 WARNING neutronclient.v2_0.client [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:55:42 compute-0 nova_compute[192716]: 2025-10-07 21:55:42.169 2 DEBUG nova.compute.manager [req-e1717716-ee07-4eb4-b670-c6effd20d1c4 req-89d7ac70-6852-4a37-a3b0-b4d174748c0f 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] Received event network-vif-plugged-6ece1b44-4312-4bd7-9ca7-9592ec9faf78 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 21:55:42 compute-0 nova_compute[192716]: 2025-10-07 21:55:42.170 2 DEBUG oslo_concurrency.lockutils [req-e1717716-ee07-4eb4-b670-c6effd20d1c4 req-89d7ac70-6852-4a37-a3b0-b4d174748c0f 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "b581f70a-01a7-4dcb-a224-b1a4b738aab4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:55:42 compute-0 nova_compute[192716]: 2025-10-07 21:55:42.170 2 DEBUG oslo_concurrency.lockutils [req-e1717716-ee07-4eb4-b670-c6effd20d1c4 req-89d7ac70-6852-4a37-a3b0-b4d174748c0f 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "b581f70a-01a7-4dcb-a224-b1a4b738aab4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:55:42 compute-0 nova_compute[192716]: 2025-10-07 21:55:42.171 2 DEBUG oslo_concurrency.lockutils [req-e1717716-ee07-4eb4-b670-c6effd20d1c4 req-89d7ac70-6852-4a37-a3b0-b4d174748c0f 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "b581f70a-01a7-4dcb-a224-b1a4b738aab4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:55:42 compute-0 nova_compute[192716]: 2025-10-07 21:55:42.171 2 DEBUG nova.compute.manager [req-e1717716-ee07-4eb4-b670-c6effd20d1c4 req-89d7ac70-6852-4a37-a3b0-b4d174748c0f 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] No waiting events found dispatching network-vif-plugged-6ece1b44-4312-4bd7-9ca7-9592ec9faf78 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 21:55:42 compute-0 nova_compute[192716]: 2025-10-07 21:55:42.172 2 WARNING nova.compute.manager [req-e1717716-ee07-4eb4-b670-c6effd20d1c4 req-89d7ac70-6852-4a37-a3b0-b4d174748c0f 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] Received unexpected event network-vif-plugged-6ece1b44-4312-4bd7-9ca7-9592ec9faf78 for instance with vm_state resized and task_state None.
Oct 07 21:55:42 compute-0 nova_compute[192716]: 2025-10-07 21:55:42.172 2 DEBUG nova.compute.manager [req-e1717716-ee07-4eb4-b670-c6effd20d1c4 req-89d7ac70-6852-4a37-a3b0-b4d174748c0f 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] Received event network-vif-plugged-6ece1b44-4312-4bd7-9ca7-9592ec9faf78 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 21:55:42 compute-0 nova_compute[192716]: 2025-10-07 21:55:42.172 2 DEBUG oslo_concurrency.lockutils [req-e1717716-ee07-4eb4-b670-c6effd20d1c4 req-89d7ac70-6852-4a37-a3b0-b4d174748c0f 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "b581f70a-01a7-4dcb-a224-b1a4b738aab4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:55:42 compute-0 nova_compute[192716]: 2025-10-07 21:55:42.173 2 DEBUG oslo_concurrency.lockutils [req-e1717716-ee07-4eb4-b670-c6effd20d1c4 req-89d7ac70-6852-4a37-a3b0-b4d174748c0f 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "b581f70a-01a7-4dcb-a224-b1a4b738aab4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:55:42 compute-0 nova_compute[192716]: 2025-10-07 21:55:42.173 2 DEBUG oslo_concurrency.lockutils [req-e1717716-ee07-4eb4-b670-c6effd20d1c4 req-89d7ac70-6852-4a37-a3b0-b4d174748c0f 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "b581f70a-01a7-4dcb-a224-b1a4b738aab4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:55:42 compute-0 nova_compute[192716]: 2025-10-07 21:55:42.173 2 DEBUG nova.compute.manager [req-e1717716-ee07-4eb4-b670-c6effd20d1c4 req-89d7ac70-6852-4a37-a3b0-b4d174748c0f 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] No waiting events found dispatching network-vif-plugged-6ece1b44-4312-4bd7-9ca7-9592ec9faf78 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 21:55:42 compute-0 nova_compute[192716]: 2025-10-07 21:55:42.174 2 WARNING nova.compute.manager [req-e1717716-ee07-4eb4-b670-c6effd20d1c4 req-89d7ac70-6852-4a37-a3b0-b4d174748c0f 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] Received unexpected event network-vif-plugged-6ece1b44-4312-4bd7-9ca7-9592ec9faf78 for instance with vm_state resized and task_state None.
Oct 07 21:55:42 compute-0 podman[217707]: 2025-10-07 21:55:42.834386433 +0000 UTC m=+0.068625594 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251007, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Oct 07 21:55:42 compute-0 podman[217706]: 2025-10-07 21:55:42.85630423 +0000 UTC m=+0.087026720 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_id=iscsid, container_name=iscsid)
Oct 07 21:55:43 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:55:43.073 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=dca786dc-b408-4181-8e47-0e14c60f13da, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:55:43 compute-0 nova_compute[192716]: 2025-10-07 21:55:43.639 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:55:43 compute-0 nova_compute[192716]: 2025-10-07 21:55:43.934 2 WARNING neutronclient.v2_0.client [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:55:44 compute-0 nova_compute[192716]: 2025-10-07 21:55:44.084 2 DEBUG nova.network.neutron [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 3407f49b-7e6b-4ff7-8ade-5caf647a9bd4] Updating instance_info_cache with network_info: [{"id": "a229ff5f-bd97-4beb-90ef-746057f7bbee", "address": "fa:16:3e:28:d9:17", "network": {"id": "f0bd9c95-1d58-40c0-8d62-097453d85d3e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-309070824-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97ef6e0949aa4dd8b3ac7e1495d532e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa229ff5f-bd", "ovs_interfaceid": "a229ff5f-bd97-4beb-90ef-746057f7bbee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 21:55:44 compute-0 nova_compute[192716]: 2025-10-07 21:55:44.290 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:55:44 compute-0 nova_compute[192716]: 2025-10-07 21:55:44.590 2 DEBUG oslo_concurrency.lockutils [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Releasing lock "refresh_cache-3407f49b-7e6b-4ff7-8ade-5caf647a9bd4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 07 21:55:45 compute-0 nova_compute[192716]: 2025-10-07 21:55:45.113 2 DEBUG oslo_concurrency.lockutils [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:55:45 compute-0 nova_compute[192716]: 2025-10-07 21:55:45.114 2 DEBUG oslo_concurrency.lockutils [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:55:45 compute-0 nova_compute[192716]: 2025-10-07 21:55:45.114 2 DEBUG oslo_concurrency.lockutils [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:55:45 compute-0 nova_compute[192716]: 2025-10-07 21:55:45.119 2 INFO nova.virt.libvirt.driver [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 3407f49b-7e6b-4ff7-8ade-5caf647a9bd4] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Oct 07 21:55:45 compute-0 virtqemud[192532]: Domain id=5 name='instance-00000006' uuid=3407f49b-7e6b-4ff7-8ade-5caf647a9bd4 is tainted: custom-monitor
Oct 07 21:55:46 compute-0 nova_compute[192716]: 2025-10-07 21:55:46.128 2 INFO nova.virt.libvirt.driver [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 3407f49b-7e6b-4ff7-8ade-5caf647a9bd4] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Oct 07 21:55:46 compute-0 podman[217746]: 2025-10-07 21:55:46.823602512 +0000 UTC m=+0.057846576 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 07 21:55:47 compute-0 nova_compute[192716]: 2025-10-07 21:55:47.137 2 INFO nova.virt.libvirt.driver [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 3407f49b-7e6b-4ff7-8ade-5caf647a9bd4] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Oct 07 21:55:47 compute-0 nova_compute[192716]: 2025-10-07 21:55:47.143 2 DEBUG nova.compute.manager [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 3407f49b-7e6b-4ff7-8ade-5caf647a9bd4] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 07 21:55:47 compute-0 nova_compute[192716]: 2025-10-07 21:55:47.653 2 DEBUG nova.objects.instance [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 3407f49b-7e6b-4ff7-8ade-5caf647a9bd4] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Oct 07 21:55:48 compute-0 nova_compute[192716]: 2025-10-07 21:55:48.641 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:55:48 compute-0 nova_compute[192716]: 2025-10-07 21:55:48.672 2 WARNING neutronclient.v2_0.client [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:55:48 compute-0 nova_compute[192716]: 2025-10-07 21:55:48.774 2 WARNING neutronclient.v2_0.client [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:55:48 compute-0 nova_compute[192716]: 2025-10-07 21:55:48.775 2 WARNING neutronclient.v2_0.client [None req-9a221faa-4a99-435d-8a10-b514cdc446c0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:55:49 compute-0 nova_compute[192716]: 2025-10-07 21:55:49.338 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:55:52 compute-0 ovn_controller[94904]: 2025-10-07T21:55:52Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:de:04:73 10.100.0.11
Oct 07 21:55:53 compute-0 nova_compute[192716]: 2025-10-07 21:55:53.644 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:55:53 compute-0 podman[217777]: 2025-10-07 21:55:53.883589289 +0000 UTC m=+0.109918244 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, tcib_managed=true)
Oct 07 21:55:54 compute-0 nova_compute[192716]: 2025-10-07 21:55:54.341 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:55:55 compute-0 podman[217805]: 2025-10-07 21:55:55.849188939 +0000 UTC m=+0.074687877 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251007)
Oct 07 21:55:57 compute-0 nova_compute[192716]: 2025-10-07 21:55:57.018 2 DEBUG oslo_concurrency.lockutils [None req-92c0d516-f413-4321-8664-90737ed25d7a b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Acquiring lock "6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:55:57 compute-0 nova_compute[192716]: 2025-10-07 21:55:57.018 2 DEBUG oslo_concurrency.lockutils [None req-92c0d516-f413-4321-8664-90737ed25d7a b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:55:57 compute-0 nova_compute[192716]: 2025-10-07 21:55:57.019 2 DEBUG oslo_concurrency.lockutils [None req-92c0d516-f413-4321-8664-90737ed25d7a b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Acquiring lock "6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:55:57 compute-0 nova_compute[192716]: 2025-10-07 21:55:57.020 2 DEBUG oslo_concurrency.lockutils [None req-92c0d516-f413-4321-8664-90737ed25d7a b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:55:57 compute-0 nova_compute[192716]: 2025-10-07 21:55:57.020 2 DEBUG oslo_concurrency.lockutils [None req-92c0d516-f413-4321-8664-90737ed25d7a b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:55:57 compute-0 nova_compute[192716]: 2025-10-07 21:55:57.046 2 INFO nova.compute.manager [None req-92c0d516-f413-4321-8664-90737ed25d7a b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Terminating instance
Oct 07 21:55:57 compute-0 nova_compute[192716]: 2025-10-07 21:55:57.656 2 DEBUG nova.compute.manager [None req-92c0d516-f413-4321-8664-90737ed25d7a b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 07 21:55:57 compute-0 kernel: tapffef8458-72 (unregistering): left promiscuous mode
Oct 07 21:55:57 compute-0 NetworkManager[51722]: <info>  [1759874157.6836] device (tapffef8458-72): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 07 21:55:57 compute-0 ovn_controller[94904]: 2025-10-07T21:55:57Z|00072|binding|INFO|Releasing lport ffef8458-72c0-4d1a-966e-e35470777c1a from this chassis (sb_readonly=0)
Oct 07 21:55:57 compute-0 ovn_controller[94904]: 2025-10-07T21:55:57Z|00073|binding|INFO|Setting lport ffef8458-72c0-4d1a-966e-e35470777c1a down in Southbound
Oct 07 21:55:57 compute-0 ovn_controller[94904]: 2025-10-07T21:55:57Z|00074|binding|INFO|Removing iface tapffef8458-72 ovn-installed in OVS
Oct 07 21:55:57 compute-0 nova_compute[192716]: 2025-10-07 21:55:57.718 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:55:57 compute-0 nova_compute[192716]: 2025-10-07 21:55:57.723 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:55:57 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:55:57.724 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:71:3a:79 10.100.0.14'], port_security=['fa:16:3e:71:3a:79 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f0bd9c95-1d58-40c0-8d62-097453d85d3e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '42e6cb8a77b54158b2345b916b6fd79b', 'neutron:revision_number': '5', 'neutron:security_group_ids': '0b409cfc-ce5d-4372-a7fd-bd2f8e7211c2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=badb36bd-51e1-4b06-9dec-6b9bc7164000, chassis=[], tunnel_key=8, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], logical_port=ffef8458-72c0-4d1a-966e-e35470777c1a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f071072e510>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 21:55:57 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:55:57.724 103791 INFO neutron.agent.ovn.metadata.agent [-] Port ffef8458-72c0-4d1a-966e-e35470777c1a in datapath f0bd9c95-1d58-40c0-8d62-097453d85d3e unbound from our chassis
Oct 07 21:55:57 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:55:57.726 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f0bd9c95-1d58-40c0-8d62-097453d85d3e
Oct 07 21:55:57 compute-0 nova_compute[192716]: 2025-10-07 21:55:57.731 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:55:57 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:55:57.744 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[daf18b57-8ef0-48ca-a90d-57c8130fd152]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:55:57 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000009.scope: Deactivated successfully.
Oct 07 21:55:57 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000009.scope: Consumed 14.635s CPU time.
Oct 07 21:55:57 compute-0 systemd-machined[152719]: Machine qemu-4-instance-00000009 terminated.
Oct 07 21:55:57 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:55:57.784 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[a61dd6a6-cbc2-4dc4-88eb-5a19e9c7293b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:55:57 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:55:57.787 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[114aed0a-5c0e-43bb-8415-da17623ba2be]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:55:57 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:55:57.814 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[18ac07f0-b20b-4507-b28a-f1f8ea9d0a75]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:55:57 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:55:57.831 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[916b97be-d795-4726-b9e2-c77129714776]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf0bd9c95-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:94:9a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 34, 'tx_packets': 13, 'rx_bytes': 1924, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 34, 'tx_packets': 13, 'rx_bytes': 1924, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 369882, 'reachable_time': 24245, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217835, 'error': None, 'target': 'ovnmeta-f0bd9c95-1d58-40c0-8d62-097453d85d3e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:55:57 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:55:57.847 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[a2eb9584-e323-4dcd-b701-c5e1308465c0]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf0bd9c95-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 369895, 'tstamp': 369895}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217836, 'error': None, 'target': 'ovnmeta-f0bd9c95-1d58-40c0-8d62-097453d85d3e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf0bd9c95-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 369898, 'tstamp': 369898}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217836, 'error': None, 'target': 'ovnmeta-f0bd9c95-1d58-40c0-8d62-097453d85d3e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:55:57 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:55:57.848 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf0bd9c95-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:55:57 compute-0 nova_compute[192716]: 2025-10-07 21:55:57.850 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:55:57 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:55:57.855 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf0bd9c95-10, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:55:57 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:55:57.855 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 21:55:57 compute-0 nova_compute[192716]: 2025-10-07 21:55:57.855 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:55:57 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:55:57.855 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf0bd9c95-10, col_values=(('external_ids', {'iface-id': 'c0a40c81-05dd-4977-aaa2-2a56498aa3a2'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:55:57 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:55:57.856 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 21:55:57 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:55:57.857 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[5e209948-1c72-46d4-945c-403235d47b7b]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-f0bd9c95-1d58-40c0-8d62-097453d85d3e\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/f0bd9c95-1d58-40c0-8d62-097453d85d3e.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID f0bd9c95-1d58-40c0-8d62-097453d85d3e\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:55:57 compute-0 nova_compute[192716]: 2025-10-07 21:55:57.882 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:55:57 compute-0 nova_compute[192716]: 2025-10-07 21:55:57.889 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:55:57 compute-0 nova_compute[192716]: 2025-10-07 21:55:57.913 2 DEBUG nova.compute.manager [req-90fe67a6-fad4-4600-a0ed-62c78215147a req-fd106316-9cda-41b7-8397-38b29692ea3a 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Received event network-vif-unplugged-ffef8458-72c0-4d1a-966e-e35470777c1a external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 21:55:57 compute-0 nova_compute[192716]: 2025-10-07 21:55:57.914 2 DEBUG oslo_concurrency.lockutils [req-90fe67a6-fad4-4600-a0ed-62c78215147a req-fd106316-9cda-41b7-8397-38b29692ea3a 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:55:57 compute-0 nova_compute[192716]: 2025-10-07 21:55:57.914 2 DEBUG oslo_concurrency.lockutils [req-90fe67a6-fad4-4600-a0ed-62c78215147a req-fd106316-9cda-41b7-8397-38b29692ea3a 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:55:57 compute-0 nova_compute[192716]: 2025-10-07 21:55:57.915 2 DEBUG oslo_concurrency.lockutils [req-90fe67a6-fad4-4600-a0ed-62c78215147a req-fd106316-9cda-41b7-8397-38b29692ea3a 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:55:57 compute-0 nova_compute[192716]: 2025-10-07 21:55:57.915 2 DEBUG nova.compute.manager [req-90fe67a6-fad4-4600-a0ed-62c78215147a req-fd106316-9cda-41b7-8397-38b29692ea3a 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] No waiting events found dispatching network-vif-unplugged-ffef8458-72c0-4d1a-966e-e35470777c1a pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 21:55:57 compute-0 nova_compute[192716]: 2025-10-07 21:55:57.915 2 DEBUG nova.compute.manager [req-90fe67a6-fad4-4600-a0ed-62c78215147a req-fd106316-9cda-41b7-8397-38b29692ea3a 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Received event network-vif-unplugged-ffef8458-72c0-4d1a-966e-e35470777c1a for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 07 21:55:57 compute-0 nova_compute[192716]: 2025-10-07 21:55:57.930 2 INFO nova.virt.libvirt.driver [-] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Instance destroyed successfully.
Oct 07 21:55:57 compute-0 nova_compute[192716]: 2025-10-07 21:55:57.931 2 DEBUG nova.objects.instance [None req-92c0d516-f413-4321-8664-90737ed25d7a b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lazy-loading 'resources' on Instance uuid 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 21:55:58 compute-0 nova_compute[192716]: 2025-10-07 21:55:58.437 2 DEBUG nova.virt.libvirt.vif [None req-92c0d516-f413-4321-8664-90737ed25d7a b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-10-07T21:54:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1048421019',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1048421019',id=9,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-07T21:55:01Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='42e6cb8a77b54158b2345b916b6fd79b',ramdisk_id='',reservation_id='r-w2f000da',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1
',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1409880739',owner_user_name='tempest-TestExecuteActionsViaActuator-1409880739-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T21:55:01Z,user_data=None,user_id='b71b837a81994b9694ede764e0406ac8',uuid=6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ffef8458-72c0-4d1a-966e-e35470777c1a", "address": "fa:16:3e:71:3a:79", "network": {"id": "f0bd9c95-1d58-40c0-8d62-097453d85d3e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-309070824-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97ef6e0949aa4dd8b3ac7e1495d532e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffef8458-72", "ovs_interfaceid": "ffef8458-72c0-4d1a-966e-e35470777c1a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 07 21:55:58 compute-0 nova_compute[192716]: 2025-10-07 21:55:58.439 2 DEBUG nova.network.os_vif_util [None req-92c0d516-f413-4321-8664-90737ed25d7a b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Converting VIF {"id": "ffef8458-72c0-4d1a-966e-e35470777c1a", "address": "fa:16:3e:71:3a:79", "network": {"id": "f0bd9c95-1d58-40c0-8d62-097453d85d3e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-309070824-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97ef6e0949aa4dd8b3ac7e1495d532e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffef8458-72", "ovs_interfaceid": "ffef8458-72c0-4d1a-966e-e35470777c1a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 21:55:58 compute-0 nova_compute[192716]: 2025-10-07 21:55:58.441 2 DEBUG nova.network.os_vif_util [None req-92c0d516-f413-4321-8664-90737ed25d7a b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:71:3a:79,bridge_name='br-int',has_traffic_filtering=True,id=ffef8458-72c0-4d1a-966e-e35470777c1a,network=Network(f0bd9c95-1d58-40c0-8d62-097453d85d3e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapffef8458-72') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 21:55:58 compute-0 nova_compute[192716]: 2025-10-07 21:55:58.442 2 DEBUG os_vif [None req-92c0d516-f413-4321-8664-90737ed25d7a b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:71:3a:79,bridge_name='br-int',has_traffic_filtering=True,id=ffef8458-72c0-4d1a-966e-e35470777c1a,network=Network(f0bd9c95-1d58-40c0-8d62-097453d85d3e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapffef8458-72') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 07 21:55:58 compute-0 nova_compute[192716]: 2025-10-07 21:55:58.445 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:55:58 compute-0 nova_compute[192716]: 2025-10-07 21:55:58.446 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapffef8458-72, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:55:58 compute-0 nova_compute[192716]: 2025-10-07 21:55:58.449 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:55:58 compute-0 nova_compute[192716]: 2025-10-07 21:55:58.453 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:55:58 compute-0 nova_compute[192716]: 2025-10-07 21:55:58.454 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:55:58 compute-0 nova_compute[192716]: 2025-10-07 21:55:58.455 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=5ca33056-da1a-4715-b9e0-0c37c4506140) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:55:58 compute-0 nova_compute[192716]: 2025-10-07 21:55:58.456 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:55:58 compute-0 nova_compute[192716]: 2025-10-07 21:55:58.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 07 21:55:58 compute-0 nova_compute[192716]: 2025-10-07 21:55:58.462 2 INFO os_vif [None req-92c0d516-f413-4321-8664-90737ed25d7a b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:71:3a:79,bridge_name='br-int',has_traffic_filtering=True,id=ffef8458-72c0-4d1a-966e-e35470777c1a,network=Network(f0bd9c95-1d58-40c0-8d62-097453d85d3e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapffef8458-72')
Oct 07 21:55:58 compute-0 nova_compute[192716]: 2025-10-07 21:55:58.463 2 INFO nova.virt.libvirt.driver [None req-92c0d516-f413-4321-8664-90737ed25d7a b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Deleting instance files /var/lib/nova/instances/6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8_del
Oct 07 21:55:58 compute-0 nova_compute[192716]: 2025-10-07 21:55:58.465 2 INFO nova.virt.libvirt.driver [None req-92c0d516-f413-4321-8664-90737ed25d7a b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Deletion of /var/lib/nova/instances/6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8_del complete
Oct 07 21:55:58 compute-0 nova_compute[192716]: 2025-10-07 21:55:58.983 2 INFO nova.compute.manager [None req-92c0d516-f413-4321-8664-90737ed25d7a b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Took 1.33 seconds to destroy the instance on the hypervisor.
Oct 07 21:55:58 compute-0 nova_compute[192716]: 2025-10-07 21:55:58.984 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-92c0d516-f413-4321-8664-90737ed25d7a b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 07 21:55:58 compute-0 nova_compute[192716]: 2025-10-07 21:55:58.984 2 DEBUG nova.compute.manager [-] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 07 21:55:58 compute-0 nova_compute[192716]: 2025-10-07 21:55:58.985 2 DEBUG nova.network.neutron [-] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 07 21:55:58 compute-0 nova_compute[192716]: 2025-10-07 21:55:58.985 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:55:59 compute-0 nova_compute[192716]: 2025-10-07 21:55:59.344 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:55:59 compute-0 podman[203153]: time="2025-10-07T21:55:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 21:55:59 compute-0 podman[203153]: @ - - [07/Oct/2025:21:55:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20751 "" "Go-http-client/1.1"
Oct 07 21:55:59 compute-0 podman[203153]: @ - - [07/Oct/2025:21:55:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3473 "" "Go-http-client/1.1"
Oct 07 21:55:59 compute-0 nova_compute[192716]: 2025-10-07 21:55:59.801 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:55:59 compute-0 podman[217854]: 2025-10-07 21:55:59.851171302 +0000 UTC m=+0.079129624 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vcs-type=git, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, config_id=edpm)
Oct 07 21:56:00 compute-0 nova_compute[192716]: 2025-10-07 21:56:00.082 2 DEBUG nova.compute.manager [req-d682ec5b-8dbf-4974-8d4a-2bc324cf3fde req-b48e034b-b321-4911-867e-9bbe97ebf0e7 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Received event network-vif-unplugged-ffef8458-72c0-4d1a-966e-e35470777c1a external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 21:56:00 compute-0 nova_compute[192716]: 2025-10-07 21:56:00.082 2 DEBUG oslo_concurrency.lockutils [req-d682ec5b-8dbf-4974-8d4a-2bc324cf3fde req-b48e034b-b321-4911-867e-9bbe97ebf0e7 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:56:00 compute-0 nova_compute[192716]: 2025-10-07 21:56:00.083 2 DEBUG oslo_concurrency.lockutils [req-d682ec5b-8dbf-4974-8d4a-2bc324cf3fde req-b48e034b-b321-4911-867e-9bbe97ebf0e7 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:56:00 compute-0 nova_compute[192716]: 2025-10-07 21:56:00.083 2 DEBUG oslo_concurrency.lockutils [req-d682ec5b-8dbf-4974-8d4a-2bc324cf3fde req-b48e034b-b321-4911-867e-9bbe97ebf0e7 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:56:00 compute-0 nova_compute[192716]: 2025-10-07 21:56:00.083 2 DEBUG nova.compute.manager [req-d682ec5b-8dbf-4974-8d4a-2bc324cf3fde req-b48e034b-b321-4911-867e-9bbe97ebf0e7 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] No waiting events found dispatching network-vif-unplugged-ffef8458-72c0-4d1a-966e-e35470777c1a pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 21:56:00 compute-0 nova_compute[192716]: 2025-10-07 21:56:00.083 2 DEBUG nova.compute.manager [req-d682ec5b-8dbf-4974-8d4a-2bc324cf3fde req-b48e034b-b321-4911-867e-9bbe97ebf0e7 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Received event network-vif-unplugged-ffef8458-72c0-4d1a-966e-e35470777c1a for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 07 21:56:00 compute-0 nova_compute[192716]: 2025-10-07 21:56:00.899 2 DEBUG nova.compute.manager [req-d97d2a19-4f49-427d-a793-f5979b67f022 req-963b5116-fbde-4f05-9c9a-cbc6f00931c6 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Received event network-vif-deleted-ffef8458-72c0-4d1a-966e-e35470777c1a external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 21:56:00 compute-0 nova_compute[192716]: 2025-10-07 21:56:00.899 2 INFO nova.compute.manager [req-d97d2a19-4f49-427d-a793-f5979b67f022 req-963b5116-fbde-4f05-9c9a-cbc6f00931c6 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Neutron deleted interface ffef8458-72c0-4d1a-966e-e35470777c1a; detaching it from the instance and deleting it from the info cache
Oct 07 21:56:00 compute-0 nova_compute[192716]: 2025-10-07 21:56:00.899 2 DEBUG nova.network.neutron [req-d97d2a19-4f49-427d-a793-f5979b67f022 req-963b5116-fbde-4f05-9c9a-cbc6f00931c6 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 21:56:01 compute-0 nova_compute[192716]: 2025-10-07 21:56:01.327 2 DEBUG nova.network.neutron [-] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 21:56:01 compute-0 nova_compute[192716]: 2025-10-07 21:56:01.406 2 DEBUG nova.compute.manager [req-d97d2a19-4f49-427d-a793-f5979b67f022 req-963b5116-fbde-4f05-9c9a-cbc6f00931c6 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Detach interface failed, port_id=ffef8458-72c0-4d1a-966e-e35470777c1a, reason: Instance 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 07 21:56:01 compute-0 openstack_network_exporter[205305]: ERROR   21:56:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:56:01 compute-0 openstack_network_exporter[205305]: ERROR   21:56:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 21:56:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 21:56:01 compute-0 openstack_network_exporter[205305]: ERROR   21:56:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:56:01 compute-0 openstack_network_exporter[205305]: ERROR   21:56:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 21:56:01 compute-0 openstack_network_exporter[205305]: ERROR   21:56:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 21:56:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 21:56:01 compute-0 nova_compute[192716]: 2025-10-07 21:56:01.832 2 INFO nova.compute.manager [-] [instance: 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8] Took 2.85 seconds to deallocate network for instance.
Oct 07 21:56:02 compute-0 nova_compute[192716]: 2025-10-07 21:56:02.367 2 DEBUG oslo_concurrency.lockutils [None req-92c0d516-f413-4321-8664-90737ed25d7a b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:56:02 compute-0 nova_compute[192716]: 2025-10-07 21:56:02.368 2 DEBUG oslo_concurrency.lockutils [None req-92c0d516-f413-4321-8664-90737ed25d7a b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:56:02 compute-0 nova_compute[192716]: 2025-10-07 21:56:02.417 2 DEBUG nova.scheduler.client.report [None req-92c0d516-f413-4321-8664-90737ed25d7a b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Refreshing inventories for resource provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Oct 07 21:56:02 compute-0 nova_compute[192716]: 2025-10-07 21:56:02.436 2 DEBUG nova.scheduler.client.report [None req-92c0d516-f413-4321-8664-90737ed25d7a b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Updating ProviderTree inventory for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Oct 07 21:56:02 compute-0 nova_compute[192716]: 2025-10-07 21:56:02.437 2 DEBUG nova.compute.provider_tree [None req-92c0d516-f413-4321-8664-90737ed25d7a b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Updating inventory in ProviderTree for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Oct 07 21:56:02 compute-0 nova_compute[192716]: 2025-10-07 21:56:02.454 2 DEBUG nova.scheduler.client.report [None req-92c0d516-f413-4321-8664-90737ed25d7a b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Refreshing aggregate associations for resource provider 19d1aa8e-e3fb-43ab-9849-122569e48a32, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Oct 07 21:56:02 compute-0 nova_compute[192716]: 2025-10-07 21:56:02.478 2 DEBUG nova.scheduler.client.report [None req-92c0d516-f413-4321-8664-90737ed25d7a b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Refreshing trait associations for resource provider 19d1aa8e-e3fb-43ab-9849-122569e48a32, traits: COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_RESCUE_BFV,COMPUTE_ACCELERATORS,HW_CPU_X86_CLMUL,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NODE,HW_CPU_X86_AVX2,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSSE3,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_ARCH_X86_64,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_DEVICE_TAGGING,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_EXTEND,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_AVX,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE41,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SVM,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_BMI2,COMPUTE_SOUND_MODEL_SB16,COMPUTE_SOUND_MODEL_USB,COMPUTE_SOUND_MODEL_AC97,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_AESNI,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,HW_CPU_X86_F16C,HW_CPU_X86_MMX,COMPUTE_SECURITY_TPM_TIS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_CRB,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_ABM,
HW_ARCH_X86_64,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_STORAGE_BUS_SCSI _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Oct 07 21:56:02 compute-0 nova_compute[192716]: 2025-10-07 21:56:02.582 2 DEBUG nova.compute.provider_tree [None req-92c0d516-f413-4321-8664-90737ed25d7a b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 21:56:03 compute-0 nova_compute[192716]: 2025-10-07 21:56:03.188 2 DEBUG nova.scheduler.client.report [None req-92c0d516-f413-4321-8664-90737ed25d7a b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 21:56:03 compute-0 nova_compute[192716]: 2025-10-07 21:56:03.458 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:03 compute-0 nova_compute[192716]: 2025-10-07 21:56:03.853 2 DEBUG oslo_concurrency.lockutils [None req-92c0d516-f413-4321-8664-90737ed25d7a b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.484s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:56:03 compute-0 nova_compute[192716]: 2025-10-07 21:56:03.926 2 INFO nova.scheduler.client.report [None req-92c0d516-f413-4321-8664-90737ed25d7a b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Deleted allocations for instance 6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8
Oct 07 21:56:04 compute-0 nova_compute[192716]: 2025-10-07 21:56:04.347 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:04 compute-0 nova_compute[192716]: 2025-10-07 21:56:04.955 2 DEBUG oslo_concurrency.lockutils [None req-92c0d516-f413-4321-8664-90737ed25d7a b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "6ef562ee-ffa5-46cd-9e56-1c0aa7ef16a8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.937s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:56:06 compute-0 nova_compute[192716]: 2025-10-07 21:56:06.607 2 DEBUG oslo_concurrency.lockutils [None req-0bc7d09a-99b1-4afa-aa1b-f1c6f31a3ad7 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Acquiring lock "b581f70a-01a7-4dcb-a224-b1a4b738aab4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:56:06 compute-0 nova_compute[192716]: 2025-10-07 21:56:06.607 2 DEBUG oslo_concurrency.lockutils [None req-0bc7d09a-99b1-4afa-aa1b-f1c6f31a3ad7 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "b581f70a-01a7-4dcb-a224-b1a4b738aab4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:56:06 compute-0 nova_compute[192716]: 2025-10-07 21:56:06.608 2 DEBUG oslo_concurrency.lockutils [None req-0bc7d09a-99b1-4afa-aa1b-f1c6f31a3ad7 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Acquiring lock "b581f70a-01a7-4dcb-a224-b1a4b738aab4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:56:06 compute-0 nova_compute[192716]: 2025-10-07 21:56:06.608 2 DEBUG oslo_concurrency.lockutils [None req-0bc7d09a-99b1-4afa-aa1b-f1c6f31a3ad7 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "b581f70a-01a7-4dcb-a224-b1a4b738aab4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:56:06 compute-0 nova_compute[192716]: 2025-10-07 21:56:06.609 2 DEBUG oslo_concurrency.lockutils [None req-0bc7d09a-99b1-4afa-aa1b-f1c6f31a3ad7 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "b581f70a-01a7-4dcb-a224-b1a4b738aab4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:56:06 compute-0 nova_compute[192716]: 2025-10-07 21:56:06.626 2 INFO nova.compute.manager [None req-0bc7d09a-99b1-4afa-aa1b-f1c6f31a3ad7 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] Terminating instance
Oct 07 21:56:07 compute-0 nova_compute[192716]: 2025-10-07 21:56:07.145 2 DEBUG nova.compute.manager [None req-0bc7d09a-99b1-4afa-aa1b-f1c6f31a3ad7 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 07 21:56:07 compute-0 kernel: tap6ece1b44-43 (unregistering): left promiscuous mode
Oct 07 21:56:07 compute-0 NetworkManager[51722]: <info>  [1759874167.1685] device (tap6ece1b44-43): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 07 21:56:07 compute-0 ovn_controller[94904]: 2025-10-07T21:56:07Z|00075|binding|INFO|Releasing lport 6ece1b44-4312-4bd7-9ca7-9592ec9faf78 from this chassis (sb_readonly=0)
Oct 07 21:56:07 compute-0 ovn_controller[94904]: 2025-10-07T21:56:07Z|00076|binding|INFO|Setting lport 6ece1b44-4312-4bd7-9ca7-9592ec9faf78 down in Southbound
Oct 07 21:56:07 compute-0 ovn_controller[94904]: 2025-10-07T21:56:07Z|00077|binding|INFO|Removing iface tap6ece1b44-43 ovn-installed in OVS
Oct 07 21:56:07 compute-0 nova_compute[192716]: 2025-10-07 21:56:07.180 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:07 compute-0 nova_compute[192716]: 2025-10-07 21:56:07.183 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:07 compute-0 nova_compute[192716]: 2025-10-07 21:56:07.193 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:07 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:07.197 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:de:04:73 10.100.0.11'], port_security=['fa:16:3e:de:04:73 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'b581f70a-01a7-4dcb-a224-b1a4b738aab4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f0bd9c95-1d58-40c0-8d62-097453d85d3e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '42e6cb8a77b54158b2345b916b6fd79b', 'neutron:revision_number': '10', 'neutron:security_group_ids': '0b409cfc-ce5d-4372-a7fd-bd2f8e7211c2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=badb36bd-51e1-4b06-9dec-6b9bc7164000, chassis=[], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], logical_port=6ece1b44-4312-4bd7-9ca7-9592ec9faf78) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f071072e510>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 21:56:07 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:07.198 103791 INFO neutron.agent.ovn.metadata.agent [-] Port 6ece1b44-4312-4bd7-9ca7-9592ec9faf78 in datapath f0bd9c95-1d58-40c0-8d62-097453d85d3e unbound from our chassis
Oct 07 21:56:07 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:07.200 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f0bd9c95-1d58-40c0-8d62-097453d85d3e
Oct 07 21:56:07 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:07.221 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[8375dbf1-8fbf-4576-a309-1a6deeb5347f]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:56:07 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000008.scope: Deactivated successfully.
Oct 07 21:56:07 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000008.scope: Consumed 13.680s CPU time.
Oct 07 21:56:07 compute-0 systemd-machined[152719]: Machine qemu-6-instance-00000008 terminated.
Oct 07 21:56:07 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:07.262 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[a10f5e82-d168-4aa1-b0e8-2872b9b485ba]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:56:07 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:07.266 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[bdac5d5e-f5da-4c27-a8b9-0456b87745a9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:56:07 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:07.305 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[33cbe22f-18e8-44cf-8c0f-999f34335e9a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:56:07 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:07.333 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[63935450-8bee-4864-8c12-db12e65a78fb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf0bd9c95-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:94:9a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 36, 'tx_packets': 15, 'rx_bytes': 2008, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 36, 'tx_packets': 15, 'rx_bytes': 2008, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 369882, 'reachable_time': 24245, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217886, 'error': None, 'target': 'ovnmeta-f0bd9c95-1d58-40c0-8d62-097453d85d3e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:56:07 compute-0 nova_compute[192716]: 2025-10-07 21:56:07.344 2 DEBUG nova.compute.manager [req-0527dc97-d3ee-4f09-acad-eb26e59741ec req-116004c6-510a-44b0-8a5f-de5ba317d065 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] Received event network-vif-unplugged-6ece1b44-4312-4bd7-9ca7-9592ec9faf78 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 21:56:07 compute-0 nova_compute[192716]: 2025-10-07 21:56:07.345 2 DEBUG oslo_concurrency.lockutils [req-0527dc97-d3ee-4f09-acad-eb26e59741ec req-116004c6-510a-44b0-8a5f-de5ba317d065 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "b581f70a-01a7-4dcb-a224-b1a4b738aab4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:56:07 compute-0 nova_compute[192716]: 2025-10-07 21:56:07.345 2 DEBUG oslo_concurrency.lockutils [req-0527dc97-d3ee-4f09-acad-eb26e59741ec req-116004c6-510a-44b0-8a5f-de5ba317d065 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "b581f70a-01a7-4dcb-a224-b1a4b738aab4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:56:07 compute-0 nova_compute[192716]: 2025-10-07 21:56:07.346 2 DEBUG oslo_concurrency.lockutils [req-0527dc97-d3ee-4f09-acad-eb26e59741ec req-116004c6-510a-44b0-8a5f-de5ba317d065 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "b581f70a-01a7-4dcb-a224-b1a4b738aab4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:56:07 compute-0 nova_compute[192716]: 2025-10-07 21:56:07.346 2 DEBUG nova.compute.manager [req-0527dc97-d3ee-4f09-acad-eb26e59741ec req-116004c6-510a-44b0-8a5f-de5ba317d065 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] No waiting events found dispatching network-vif-unplugged-6ece1b44-4312-4bd7-9ca7-9592ec9faf78 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 21:56:07 compute-0 nova_compute[192716]: 2025-10-07 21:56:07.347 2 DEBUG nova.compute.manager [req-0527dc97-d3ee-4f09-acad-eb26e59741ec req-116004c6-510a-44b0-8a5f-de5ba317d065 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] Received event network-vif-unplugged-6ece1b44-4312-4bd7-9ca7-9592ec9faf78 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 07 21:56:07 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:07.356 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[d4f557ed-cd47-4e5e-9a12-8aaf4f0ed68e]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf0bd9c95-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 369895, 'tstamp': 369895}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217887, 'error': None, 'target': 'ovnmeta-f0bd9c95-1d58-40c0-8d62-097453d85d3e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf0bd9c95-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 369898, 'tstamp': 369898}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217887, 'error': None, 'target': 'ovnmeta-f0bd9c95-1d58-40c0-8d62-097453d85d3e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:56:07 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:07.358 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf0bd9c95-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:56:07 compute-0 nova_compute[192716]: 2025-10-07 21:56:07.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:07 compute-0 nova_compute[192716]: 2025-10-07 21:56:07.398 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:07 compute-0 nova_compute[192716]: 2025-10-07 21:56:07.407 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:07 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:07.407 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf0bd9c95-10, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:56:07 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:07.407 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 21:56:07 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:07.408 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf0bd9c95-10, col_values=(('external_ids', {'iface-id': 'c0a40c81-05dd-4977-aaa2-2a56498aa3a2'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:56:07 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:07.408 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 21:56:07 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:07.409 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[969d6fdf-54cf-478e-95f8-23175b14a987]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-f0bd9c95-1d58-40c0-8d62-097453d85d3e\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/f0bd9c95-1d58-40c0-8d62-097453d85d3e.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID f0bd9c95-1d58-40c0-8d62-097453d85d3e\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:56:07 compute-0 nova_compute[192716]: 2025-10-07 21:56:07.463 2 INFO nova.virt.libvirt.driver [-] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] Instance destroyed successfully.
Oct 07 21:56:07 compute-0 nova_compute[192716]: 2025-10-07 21:56:07.464 2 DEBUG nova.objects.instance [None req-0bc7d09a-99b1-4afa-aa1b-f1c6f31a3ad7 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lazy-loading 'resources' on Instance uuid b581f70a-01a7-4dcb-a224-b1a4b738aab4 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 21:56:07 compute-0 nova_compute[192716]: 2025-10-07 21:56:07.974 2 DEBUG nova.virt.libvirt.vif [None req-0bc7d09a-99b1-4afa-aa1b-f1c6f31a3ad7 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-10-07T21:54:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-210535964',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-210535964',id=8,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-07T21:55:41Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='42e6cb8a77b54158b2345b916b6fd79b',ramdisk_id='',reservation_id='r-h0vzfr0p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',
image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1409880739',owner_user_name='tempest-TestExecuteActionsViaActuator-1409880739-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T21:55:52Z,user_data=None,user_id='b71b837a81994b9694ede764e0406ac8',uuid=b581f70a-01a7-4dcb-a224-b1a4b738aab4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6ece1b44-4312-4bd7-9ca7-9592ec9faf78", "address": "fa:16:3e:de:04:73", "network": {"id": "f0bd9c95-1d58-40c0-8d62-097453d85d3e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-309070824-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97ef6e0949aa4dd8b3ac7e1495d532e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ece1b44-43", "ovs_interfaceid": "6ece1b44-4312-4bd7-9ca7-9592ec9faf78", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 07 21:56:07 compute-0 nova_compute[192716]: 2025-10-07 21:56:07.975 2 DEBUG nova.network.os_vif_util [None req-0bc7d09a-99b1-4afa-aa1b-f1c6f31a3ad7 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Converting VIF {"id": "6ece1b44-4312-4bd7-9ca7-9592ec9faf78", "address": "fa:16:3e:de:04:73", "network": {"id": "f0bd9c95-1d58-40c0-8d62-097453d85d3e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-309070824-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97ef6e0949aa4dd8b3ac7e1495d532e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ece1b44-43", "ovs_interfaceid": "6ece1b44-4312-4bd7-9ca7-9592ec9faf78", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 21:56:07 compute-0 nova_compute[192716]: 2025-10-07 21:56:07.975 2 DEBUG nova.network.os_vif_util [None req-0bc7d09a-99b1-4afa-aa1b-f1c6f31a3ad7 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:de:04:73,bridge_name='br-int',has_traffic_filtering=True,id=6ece1b44-4312-4bd7-9ca7-9592ec9faf78,network=Network(f0bd9c95-1d58-40c0-8d62-097453d85d3e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6ece1b44-43') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 21:56:07 compute-0 nova_compute[192716]: 2025-10-07 21:56:07.976 2 DEBUG os_vif [None req-0bc7d09a-99b1-4afa-aa1b-f1c6f31a3ad7 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:de:04:73,bridge_name='br-int',has_traffic_filtering=True,id=6ece1b44-4312-4bd7-9ca7-9592ec9faf78,network=Network(f0bd9c95-1d58-40c0-8d62-097453d85d3e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6ece1b44-43') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 07 21:56:07 compute-0 nova_compute[192716]: 2025-10-07 21:56:07.978 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:07 compute-0 nova_compute[192716]: 2025-10-07 21:56:07.978 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6ece1b44-43, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:56:07 compute-0 nova_compute[192716]: 2025-10-07 21:56:07.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:07 compute-0 nova_compute[192716]: 2025-10-07 21:56:07.981 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:07 compute-0 nova_compute[192716]: 2025-10-07 21:56:07.982 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:07 compute-0 nova_compute[192716]: 2025-10-07 21:56:07.983 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=24a7b4d1-6591-44fd-aaac-32143bea268b) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:56:07 compute-0 nova_compute[192716]: 2025-10-07 21:56:07.983 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:07 compute-0 nova_compute[192716]: 2025-10-07 21:56:07.984 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:07 compute-0 nova_compute[192716]: 2025-10-07 21:56:07.987 2 INFO os_vif [None req-0bc7d09a-99b1-4afa-aa1b-f1c6f31a3ad7 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:de:04:73,bridge_name='br-int',has_traffic_filtering=True,id=6ece1b44-4312-4bd7-9ca7-9592ec9faf78,network=Network(f0bd9c95-1d58-40c0-8d62-097453d85d3e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6ece1b44-43')
Oct 07 21:56:07 compute-0 nova_compute[192716]: 2025-10-07 21:56:07.988 2 INFO nova.virt.libvirt.driver [None req-0bc7d09a-99b1-4afa-aa1b-f1c6f31a3ad7 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] Deleting instance files /var/lib/nova/instances/b581f70a-01a7-4dcb-a224-b1a4b738aab4_del
Oct 07 21:56:07 compute-0 nova_compute[192716]: 2025-10-07 21:56:07.991 2 INFO nova.virt.libvirt.driver [None req-0bc7d09a-99b1-4afa-aa1b-f1c6f31a3ad7 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] Deletion of /var/lib/nova/instances/b581f70a-01a7-4dcb-a224-b1a4b738aab4_del complete
Oct 07 21:56:08 compute-0 nova_compute[192716]: 2025-10-07 21:56:08.524 2 INFO nova.compute.manager [None req-0bc7d09a-99b1-4afa-aa1b-f1c6f31a3ad7 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] Took 1.38 seconds to destroy the instance on the hypervisor.
Oct 07 21:56:08 compute-0 nova_compute[192716]: 2025-10-07 21:56:08.525 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-0bc7d09a-99b1-4afa-aa1b-f1c6f31a3ad7 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 07 21:56:08 compute-0 nova_compute[192716]: 2025-10-07 21:56:08.525 2 DEBUG nova.compute.manager [-] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 07 21:56:08 compute-0 nova_compute[192716]: 2025-10-07 21:56:08.525 2 DEBUG nova.network.neutron [-] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 07 21:56:08 compute-0 nova_compute[192716]: 2025-10-07 21:56:08.526 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:56:08 compute-0 nova_compute[192716]: 2025-10-07 21:56:08.760 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:56:09 compute-0 nova_compute[192716]: 2025-10-07 21:56:09.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:09 compute-0 nova_compute[192716]: 2025-10-07 21:56:09.545 2 DEBUG nova.compute.manager [req-27fda094-1f8a-4ebf-95e3-51c86670631b req-e551f1f5-555c-4aaf-ac3f-49f9d6d92fe5 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] Received event network-vif-unplugged-6ece1b44-4312-4bd7-9ca7-9592ec9faf78 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 21:56:09 compute-0 nova_compute[192716]: 2025-10-07 21:56:09.546 2 DEBUG oslo_concurrency.lockutils [req-27fda094-1f8a-4ebf-95e3-51c86670631b req-e551f1f5-555c-4aaf-ac3f-49f9d6d92fe5 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "b581f70a-01a7-4dcb-a224-b1a4b738aab4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:56:09 compute-0 nova_compute[192716]: 2025-10-07 21:56:09.546 2 DEBUG oslo_concurrency.lockutils [req-27fda094-1f8a-4ebf-95e3-51c86670631b req-e551f1f5-555c-4aaf-ac3f-49f9d6d92fe5 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "b581f70a-01a7-4dcb-a224-b1a4b738aab4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:56:09 compute-0 nova_compute[192716]: 2025-10-07 21:56:09.547 2 DEBUG oslo_concurrency.lockutils [req-27fda094-1f8a-4ebf-95e3-51c86670631b req-e551f1f5-555c-4aaf-ac3f-49f9d6d92fe5 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "b581f70a-01a7-4dcb-a224-b1a4b738aab4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:56:09 compute-0 nova_compute[192716]: 2025-10-07 21:56:09.547 2 DEBUG nova.compute.manager [req-27fda094-1f8a-4ebf-95e3-51c86670631b req-e551f1f5-555c-4aaf-ac3f-49f9d6d92fe5 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] No waiting events found dispatching network-vif-unplugged-6ece1b44-4312-4bd7-9ca7-9592ec9faf78 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 21:56:09 compute-0 nova_compute[192716]: 2025-10-07 21:56:09.547 2 DEBUG nova.compute.manager [req-27fda094-1f8a-4ebf-95e3-51c86670631b req-e551f1f5-555c-4aaf-ac3f-49f9d6d92fe5 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] Received event network-vif-unplugged-6ece1b44-4312-4bd7-9ca7-9592ec9faf78 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 07 21:56:09 compute-0 nova_compute[192716]: 2025-10-07 21:56:09.548 2 DEBUG nova.compute.manager [req-27fda094-1f8a-4ebf-95e3-51c86670631b req-e551f1f5-555c-4aaf-ac3f-49f9d6d92fe5 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] Received event network-vif-deleted-6ece1b44-4312-4bd7-9ca7-9592ec9faf78 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 21:56:09 compute-0 nova_compute[192716]: 2025-10-07 21:56:09.548 2 INFO nova.compute.manager [req-27fda094-1f8a-4ebf-95e3-51c86670631b req-e551f1f5-555c-4aaf-ac3f-49f9d6d92fe5 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] Neutron deleted interface 6ece1b44-4312-4bd7-9ca7-9592ec9faf78; detaching it from the instance and deleting it from the info cache
Oct 07 21:56:09 compute-0 nova_compute[192716]: 2025-10-07 21:56:09.548 2 DEBUG nova.network.neutron [req-27fda094-1f8a-4ebf-95e3-51c86670631b req-e551f1f5-555c-4aaf-ac3f-49f9d6d92fe5 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 21:56:09 compute-0 nova_compute[192716]: 2025-10-07 21:56:09.561 2 DEBUG nova.network.neutron [-] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 21:56:10 compute-0 nova_compute[192716]: 2025-10-07 21:56:10.057 2 DEBUG nova.compute.manager [req-27fda094-1f8a-4ebf-95e3-51c86670631b req-e551f1f5-555c-4aaf-ac3f-49f9d6d92fe5 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] Detach interface failed, port_id=6ece1b44-4312-4bd7-9ca7-9592ec9faf78, reason: Instance b581f70a-01a7-4dcb-a224-b1a4b738aab4 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 07 21:56:10 compute-0 nova_compute[192716]: 2025-10-07 21:56:10.069 2 INFO nova.compute.manager [-] [instance: b581f70a-01a7-4dcb-a224-b1a4b738aab4] Took 1.54 seconds to deallocate network for instance.
Oct 07 21:56:10 compute-0 nova_compute[192716]: 2025-10-07 21:56:10.592 2 DEBUG oslo_concurrency.lockutils [None req-0bc7d09a-99b1-4afa-aa1b-f1c6f31a3ad7 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:56:10 compute-0 nova_compute[192716]: 2025-10-07 21:56:10.592 2 DEBUG oslo_concurrency.lockutils [None req-0bc7d09a-99b1-4afa-aa1b-f1c6f31a3ad7 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:56:10 compute-0 nova_compute[192716]: 2025-10-07 21:56:10.598 2 DEBUG oslo_concurrency.lockutils [None req-0bc7d09a-99b1-4afa-aa1b-f1c6f31a3ad7 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.006s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:56:10 compute-0 nova_compute[192716]: 2025-10-07 21:56:10.632 2 INFO nova.scheduler.client.report [None req-0bc7d09a-99b1-4afa-aa1b-f1c6f31a3ad7 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Deleted allocations for instance b581f70a-01a7-4dcb-a224-b1a4b738aab4
Oct 07 21:56:11 compute-0 nova_compute[192716]: 2025-10-07 21:56:11.663 2 DEBUG oslo_concurrency.lockutils [None req-0bc7d09a-99b1-4afa-aa1b-f1c6f31a3ad7 b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "b581f70a-01a7-4dcb-a224-b1a4b738aab4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.055s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:56:12 compute-0 nova_compute[192716]: 2025-10-07 21:56:12.984 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:13 compute-0 nova_compute[192716]: 2025-10-07 21:56:13.060 2 DEBUG oslo_concurrency.lockutils [None req-81bd21f5-6897-425b-b207-9e707b21613d b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Acquiring lock "c0b3d97e-60fb-487c-90d3-2b48392ff09f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:56:13 compute-0 nova_compute[192716]: 2025-10-07 21:56:13.060 2 DEBUG oslo_concurrency.lockutils [None req-81bd21f5-6897-425b-b207-9e707b21613d b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "c0b3d97e-60fb-487c-90d3-2b48392ff09f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:56:13 compute-0 nova_compute[192716]: 2025-10-07 21:56:13.061 2 DEBUG oslo_concurrency.lockutils [None req-81bd21f5-6897-425b-b207-9e707b21613d b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Acquiring lock "c0b3d97e-60fb-487c-90d3-2b48392ff09f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:56:13 compute-0 nova_compute[192716]: 2025-10-07 21:56:13.061 2 DEBUG oslo_concurrency.lockutils [None req-81bd21f5-6897-425b-b207-9e707b21613d b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "c0b3d97e-60fb-487c-90d3-2b48392ff09f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:56:13 compute-0 nova_compute[192716]: 2025-10-07 21:56:13.061 2 DEBUG oslo_concurrency.lockutils [None req-81bd21f5-6897-425b-b207-9e707b21613d b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "c0b3d97e-60fb-487c-90d3-2b48392ff09f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:56:13 compute-0 nova_compute[192716]: 2025-10-07 21:56:13.072 2 INFO nova.compute.manager [None req-81bd21f5-6897-425b-b207-9e707b21613d b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Terminating instance
Oct 07 21:56:13 compute-0 nova_compute[192716]: 2025-10-07 21:56:13.589 2 DEBUG nova.compute.manager [None req-81bd21f5-6897-425b-b207-9e707b21613d b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 07 21:56:13 compute-0 kernel: tapc20e63df-b9 (unregistering): left promiscuous mode
Oct 07 21:56:13 compute-0 NetworkManager[51722]: <info>  [1759874173.6144] device (tapc20e63df-b9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 07 21:56:13 compute-0 ovn_controller[94904]: 2025-10-07T21:56:13Z|00078|binding|INFO|Releasing lport c20e63df-b9ab-4daf-b7bb-502dff45fae0 from this chassis (sb_readonly=0)
Oct 07 21:56:13 compute-0 ovn_controller[94904]: 2025-10-07T21:56:13Z|00079|binding|INFO|Setting lport c20e63df-b9ab-4daf-b7bb-502dff45fae0 down in Southbound
Oct 07 21:56:13 compute-0 ovn_controller[94904]: 2025-10-07T21:56:13Z|00080|binding|INFO|Removing iface tapc20e63df-b9 ovn-installed in OVS
Oct 07 21:56:13 compute-0 nova_compute[192716]: 2025-10-07 21:56:13.671 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:13 compute-0 nova_compute[192716]: 2025-10-07 21:56:13.674 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:13 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:13.678 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:95:12:f3 10.100.0.10'], port_security=['fa:16:3e:95:12:f3 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'c0b3d97e-60fb-487c-90d3-2b48392ff09f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f0bd9c95-1d58-40c0-8d62-097453d85d3e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '42e6cb8a77b54158b2345b916b6fd79b', 'neutron:revision_number': '5', 'neutron:security_group_ids': '0b409cfc-ce5d-4372-a7fd-bd2f8e7211c2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=badb36bd-51e1-4b06-9dec-6b9bc7164000, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], logical_port=c20e63df-b9ab-4daf-b7bb-502dff45fae0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f071072e510>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 21:56:13 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:13.679 103791 INFO neutron.agent.ovn.metadata.agent [-] Port c20e63df-b9ab-4daf-b7bb-502dff45fae0 in datapath f0bd9c95-1d58-40c0-8d62-097453d85d3e unbound from our chassis
Oct 07 21:56:13 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:13.681 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f0bd9c95-1d58-40c0-8d62-097453d85d3e
Oct 07 21:56:13 compute-0 nova_compute[192716]: 2025-10-07 21:56:13.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:13 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:13.701 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[13531335-4f03-44a1-b294-0b684ff8c517]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:56:13 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000007.scope: Deactivated successfully.
Oct 07 21:56:13 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:13.740 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[905b0f83-d8cd-4abc-a54e-15bdc5ee2423]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:56:13 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000007.scope: Consumed 17.281s CPU time.
Oct 07 21:56:13 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:13.743 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[e1d71e40-0206-4ccc-921a-0678384c6cf8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:56:13 compute-0 systemd-machined[152719]: Machine qemu-3-instance-00000007 terminated.
Oct 07 21:56:13 compute-0 podman[217906]: 2025-10-07 21:56:13.776538911 +0000 UTC m=+0.067042987 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Oct 07 21:56:13 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:13.780 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[4e1edc11-0064-4c3f-b77c-d87152b8bdec]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:56:13 compute-0 podman[217908]: 2025-10-07 21:56:13.800761224 +0000 UTC m=+0.091397994 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251007, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Oct 07 21:56:13 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:13.804 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[9bbce667-b2af-44b6-9512-d5dba9d52c73]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf0bd9c95-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:94:9a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 36, 'tx_packets': 17, 'rx_bytes': 2008, 'tx_bytes': 858, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 36, 'tx_packets': 17, 'rx_bytes': 2008, 'tx_bytes': 858, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 369882, 'reachable_time': 24245, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217954, 'error': None, 'target': 'ovnmeta-f0bd9c95-1d58-40c0-8d62-097453d85d3e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:56:13 compute-0 nova_compute[192716]: 2025-10-07 21:56:13.817 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:13 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:13.826 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[3de73314-6d33-4d34-b706-c8005410bf9f]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf0bd9c95-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 369895, 'tstamp': 369895}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217958, 'error': None, 'target': 'ovnmeta-f0bd9c95-1d58-40c0-8d62-097453d85d3e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf0bd9c95-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 369898, 'tstamp': 369898}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217958, 'error': None, 'target': 'ovnmeta-f0bd9c95-1d58-40c0-8d62-097453d85d3e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:56:13 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:13.827 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf0bd9c95-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:56:13 compute-0 nova_compute[192716]: 2025-10-07 21:56:13.829 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:13 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:13.832 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf0bd9c95-10, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:56:13 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:13.833 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 21:56:13 compute-0 nova_compute[192716]: 2025-10-07 21:56:13.832 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:13 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:13.833 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf0bd9c95-10, col_values=(('external_ids', {'iface-id': 'c0a40c81-05dd-4977-aaa2-2a56498aa3a2'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:56:13 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:13.833 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 21:56:13 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:13.834 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[f0d42c80-9873-44b8-a9eb-cd925c30d763]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-f0bd9c95-1d58-40c0-8d62-097453d85d3e\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/f0bd9c95-1d58-40c0-8d62-097453d85d3e.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID f0bd9c95-1d58-40c0-8d62-097453d85d3e\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:56:13 compute-0 nova_compute[192716]: 2025-10-07 21:56:13.857 2 DEBUG nova.compute.manager [req-2df11966-fcbc-4645-9569-fdf6624e2de0 req-6e8bc198-2ce8-4a10-9a2b-554c3c29be78 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Received event network-vif-unplugged-c20e63df-b9ab-4daf-b7bb-502dff45fae0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 21:56:13 compute-0 nova_compute[192716]: 2025-10-07 21:56:13.858 2 DEBUG oslo_concurrency.lockutils [req-2df11966-fcbc-4645-9569-fdf6624e2de0 req-6e8bc198-2ce8-4a10-9a2b-554c3c29be78 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "c0b3d97e-60fb-487c-90d3-2b48392ff09f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:56:13 compute-0 nova_compute[192716]: 2025-10-07 21:56:13.859 2 DEBUG oslo_concurrency.lockutils [req-2df11966-fcbc-4645-9569-fdf6624e2de0 req-6e8bc198-2ce8-4a10-9a2b-554c3c29be78 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "c0b3d97e-60fb-487c-90d3-2b48392ff09f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:56:13 compute-0 nova_compute[192716]: 2025-10-07 21:56:13.859 2 DEBUG oslo_concurrency.lockutils [req-2df11966-fcbc-4645-9569-fdf6624e2de0 req-6e8bc198-2ce8-4a10-9a2b-554c3c29be78 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "c0b3d97e-60fb-487c-90d3-2b48392ff09f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:56:13 compute-0 nova_compute[192716]: 2025-10-07 21:56:13.860 2 DEBUG nova.compute.manager [req-2df11966-fcbc-4645-9569-fdf6624e2de0 req-6e8bc198-2ce8-4a10-9a2b-554c3c29be78 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] No waiting events found dispatching network-vif-unplugged-c20e63df-b9ab-4daf-b7bb-502dff45fae0 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 21:56:13 compute-0 nova_compute[192716]: 2025-10-07 21:56:13.860 2 DEBUG nova.compute.manager [req-2df11966-fcbc-4645-9569-fdf6624e2de0 req-6e8bc198-2ce8-4a10-9a2b-554c3c29be78 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Received event network-vif-unplugged-c20e63df-b9ab-4daf-b7bb-502dff45fae0 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 07 21:56:13 compute-0 nova_compute[192716]: 2025-10-07 21:56:13.868 2 INFO nova.virt.libvirt.driver [-] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Instance destroyed successfully.
Oct 07 21:56:13 compute-0 nova_compute[192716]: 2025-10-07 21:56:13.869 2 DEBUG nova.objects.instance [None req-81bd21f5-6897-425b-b207-9e707b21613d b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lazy-loading 'resources' on Instance uuid c0b3d97e-60fb-487c-90d3-2b48392ff09f obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 21:56:14 compute-0 nova_compute[192716]: 2025-10-07 21:56:14.351 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:14 compute-0 nova_compute[192716]: 2025-10-07 21:56:14.376 2 DEBUG nova.virt.libvirt.vif [None req-81bd21f5-6897-425b-b207-9e707b21613d b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-10-07T21:53:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-707264346',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-707264346',id=7,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-07T21:54:08Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='42e6cb8a77b54158b2345b916b6fd79b',ramdisk_id='',reservation_id='r-0a4go6gs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',
image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1409880739',owner_user_name='tempest-TestExecuteActionsViaActuator-1409880739-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T21:54:08Z,user_data=None,user_id='b71b837a81994b9694ede764e0406ac8',uuid=c0b3d97e-60fb-487c-90d3-2b48392ff09f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c20e63df-b9ab-4daf-b7bb-502dff45fae0", "address": "fa:16:3e:95:12:f3", "network": {"id": "f0bd9c95-1d58-40c0-8d62-097453d85d3e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-309070824-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97ef6e0949aa4dd8b3ac7e1495d532e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc20e63df-b9", "ovs_interfaceid": "c20e63df-b9ab-4daf-b7bb-502dff45fae0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 07 21:56:14 compute-0 nova_compute[192716]: 2025-10-07 21:56:14.377 2 DEBUG nova.network.os_vif_util [None req-81bd21f5-6897-425b-b207-9e707b21613d b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Converting VIF {"id": "c20e63df-b9ab-4daf-b7bb-502dff45fae0", "address": "fa:16:3e:95:12:f3", "network": {"id": "f0bd9c95-1d58-40c0-8d62-097453d85d3e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-309070824-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97ef6e0949aa4dd8b3ac7e1495d532e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc20e63df-b9", "ovs_interfaceid": "c20e63df-b9ab-4daf-b7bb-502dff45fae0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 21:56:14 compute-0 nova_compute[192716]: 2025-10-07 21:56:14.378 2 DEBUG nova.network.os_vif_util [None req-81bd21f5-6897-425b-b207-9e707b21613d b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:95:12:f3,bridge_name='br-int',has_traffic_filtering=True,id=c20e63df-b9ab-4daf-b7bb-502dff45fae0,network=Network(f0bd9c95-1d58-40c0-8d62-097453d85d3e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc20e63df-b9') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 21:56:14 compute-0 nova_compute[192716]: 2025-10-07 21:56:14.379 2 DEBUG os_vif [None req-81bd21f5-6897-425b-b207-9e707b21613d b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:95:12:f3,bridge_name='br-int',has_traffic_filtering=True,id=c20e63df-b9ab-4daf-b7bb-502dff45fae0,network=Network(f0bd9c95-1d58-40c0-8d62-097453d85d3e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc20e63df-b9') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 07 21:56:14 compute-0 nova_compute[192716]: 2025-10-07 21:56:14.382 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:14 compute-0 nova_compute[192716]: 2025-10-07 21:56:14.382 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc20e63df-b9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:56:14 compute-0 nova_compute[192716]: 2025-10-07 21:56:14.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:14 compute-0 nova_compute[192716]: 2025-10-07 21:56:14.386 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:14 compute-0 nova_compute[192716]: 2025-10-07 21:56:14.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:14 compute-0 nova_compute[192716]: 2025-10-07 21:56:14.388 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=04a29024-fefd-4825-aeab-19dd0c6c0706) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:56:14 compute-0 nova_compute[192716]: 2025-10-07 21:56:14.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:14 compute-0 nova_compute[192716]: 2025-10-07 21:56:14.390 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:14 compute-0 nova_compute[192716]: 2025-10-07 21:56:14.393 2 INFO os_vif [None req-81bd21f5-6897-425b-b207-9e707b21613d b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:95:12:f3,bridge_name='br-int',has_traffic_filtering=True,id=c20e63df-b9ab-4daf-b7bb-502dff45fae0,network=Network(f0bd9c95-1d58-40c0-8d62-097453d85d3e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc20e63df-b9')
Oct 07 21:56:14 compute-0 nova_compute[192716]: 2025-10-07 21:56:14.394 2 INFO nova.virt.libvirt.driver [None req-81bd21f5-6897-425b-b207-9e707b21613d b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Deleting instance files /var/lib/nova/instances/c0b3d97e-60fb-487c-90d3-2b48392ff09f_del
Oct 07 21:56:14 compute-0 nova_compute[192716]: 2025-10-07 21:56:14.395 2 INFO nova.virt.libvirt.driver [None req-81bd21f5-6897-425b-b207-9e707b21613d b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Deletion of /var/lib/nova/instances/c0b3d97e-60fb-487c-90d3-2b48392ff09f_del complete
Oct 07 21:56:14 compute-0 nova_compute[192716]: 2025-10-07 21:56:14.911 2 INFO nova.compute.manager [None req-81bd21f5-6897-425b-b207-9e707b21613d b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Took 1.32 seconds to destroy the instance on the hypervisor.
Oct 07 21:56:14 compute-0 nova_compute[192716]: 2025-10-07 21:56:14.912 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-81bd21f5-6897-425b-b207-9e707b21613d b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 07 21:56:14 compute-0 nova_compute[192716]: 2025-10-07 21:56:14.913 2 DEBUG nova.compute.manager [-] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 07 21:56:14 compute-0 nova_compute[192716]: 2025-10-07 21:56:14.913 2 DEBUG nova.network.neutron [-] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 07 21:56:14 compute-0 nova_compute[192716]: 2025-10-07 21:56:14.913 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:56:15 compute-0 nova_compute[192716]: 2025-10-07 21:56:15.801 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:56:15 compute-0 nova_compute[192716]: 2025-10-07 21:56:15.941 2 DEBUG nova.compute.manager [req-5e362f0a-b6ab-4e44-9fd5-434a20821964 req-751f428f-e423-4e0f-851a-8de2967d2c1f 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Received event network-vif-unplugged-c20e63df-b9ab-4daf-b7bb-502dff45fae0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 21:56:15 compute-0 nova_compute[192716]: 2025-10-07 21:56:15.942 2 DEBUG oslo_concurrency.lockutils [req-5e362f0a-b6ab-4e44-9fd5-434a20821964 req-751f428f-e423-4e0f-851a-8de2967d2c1f 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "c0b3d97e-60fb-487c-90d3-2b48392ff09f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:56:15 compute-0 nova_compute[192716]: 2025-10-07 21:56:15.942 2 DEBUG oslo_concurrency.lockutils [req-5e362f0a-b6ab-4e44-9fd5-434a20821964 req-751f428f-e423-4e0f-851a-8de2967d2c1f 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "c0b3d97e-60fb-487c-90d3-2b48392ff09f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:56:15 compute-0 nova_compute[192716]: 2025-10-07 21:56:15.942 2 DEBUG oslo_concurrency.lockutils [req-5e362f0a-b6ab-4e44-9fd5-434a20821964 req-751f428f-e423-4e0f-851a-8de2967d2c1f 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "c0b3d97e-60fb-487c-90d3-2b48392ff09f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:56:15 compute-0 nova_compute[192716]: 2025-10-07 21:56:15.942 2 DEBUG nova.compute.manager [req-5e362f0a-b6ab-4e44-9fd5-434a20821964 req-751f428f-e423-4e0f-851a-8de2967d2c1f 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] No waiting events found dispatching network-vif-unplugged-c20e63df-b9ab-4daf-b7bb-502dff45fae0 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 21:56:15 compute-0 nova_compute[192716]: 2025-10-07 21:56:15.942 2 DEBUG nova.compute.manager [req-5e362f0a-b6ab-4e44-9fd5-434a20821964 req-751f428f-e423-4e0f-851a-8de2967d2c1f 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Received event network-vif-unplugged-c20e63df-b9ab-4daf-b7bb-502dff45fae0 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 07 21:56:15 compute-0 nova_compute[192716]: 2025-10-07 21:56:15.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:56:16 compute-0 nova_compute[192716]: 2025-10-07 21:56:16.933 2 DEBUG nova.compute.manager [req-496ce265-bf14-4468-9edc-257fcfd3d4fc req-94d3a9d1-f3f4-4396-8af5-6797f0ed21fc 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Received event network-vif-deleted-c20e63df-b9ab-4daf-b7bb-502dff45fae0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 21:56:16 compute-0 nova_compute[192716]: 2025-10-07 21:56:16.934 2 INFO nova.compute.manager [req-496ce265-bf14-4468-9edc-257fcfd3d4fc req-94d3a9d1-f3f4-4396-8af5-6797f0ed21fc 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Neutron deleted interface c20e63df-b9ab-4daf-b7bb-502dff45fae0; detaching it from the instance and deleting it from the info cache
Oct 07 21:56:16 compute-0 nova_compute[192716]: 2025-10-07 21:56:16.934 2 DEBUG nova.network.neutron [req-496ce265-bf14-4468-9edc-257fcfd3d4fc req-94d3a9d1-f3f4-4396-8af5-6797f0ed21fc 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 21:56:17 compute-0 nova_compute[192716]: 2025-10-07 21:56:17.348 2 DEBUG nova.network.neutron [-] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 21:56:17 compute-0 nova_compute[192716]: 2025-10-07 21:56:17.443 2 DEBUG nova.compute.manager [req-496ce265-bf14-4468-9edc-257fcfd3d4fc req-94d3a9d1-f3f4-4396-8af5-6797f0ed21fc 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Detach interface failed, port_id=c20e63df-b9ab-4daf-b7bb-502dff45fae0, reason: Instance c0b3d97e-60fb-487c-90d3-2b48392ff09f could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 07 21:56:17 compute-0 podman[217974]: 2025-10-07 21:56:17.855027063 +0000 UTC m=+0.084488378 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 07 21:56:17 compute-0 nova_compute[192716]: 2025-10-07 21:56:17.855 2 INFO nova.compute.manager [-] [instance: c0b3d97e-60fb-487c-90d3-2b48392ff09f] Took 2.94 seconds to deallocate network for instance.
Oct 07 21:56:18 compute-0 nova_compute[192716]: 2025-10-07 21:56:18.381 2 DEBUG oslo_concurrency.lockutils [None req-81bd21f5-6897-425b-b207-9e707b21613d b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:56:18 compute-0 nova_compute[192716]: 2025-10-07 21:56:18.382 2 DEBUG oslo_concurrency.lockutils [None req-81bd21f5-6897-425b-b207-9e707b21613d b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:56:18 compute-0 nova_compute[192716]: 2025-10-07 21:56:18.471 2 DEBUG nova.compute.provider_tree [None req-81bd21f5-6897-425b-b207-9e707b21613d b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 21:56:18 compute-0 nova_compute[192716]: 2025-10-07 21:56:18.982 2 DEBUG nova.scheduler.client.report [None req-81bd21f5-6897-425b-b207-9e707b21613d b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 21:56:19 compute-0 nova_compute[192716]: 2025-10-07 21:56:19.353 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:19 compute-0 nova_compute[192716]: 2025-10-07 21:56:19.390 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:19 compute-0 nova_compute[192716]: 2025-10-07 21:56:19.495 2 DEBUG oslo_concurrency.lockutils [None req-81bd21f5-6897-425b-b207-9e707b21613d b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.113s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:56:19 compute-0 nova_compute[192716]: 2025-10-07 21:56:19.530 2 INFO nova.scheduler.client.report [None req-81bd21f5-6897-425b-b207-9e707b21613d b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Deleted allocations for instance c0b3d97e-60fb-487c-90d3-2b48392ff09f
Oct 07 21:56:20 compute-0 nova_compute[192716]: 2025-10-07 21:56:20.563 2 DEBUG oslo_concurrency.lockutils [None req-81bd21f5-6897-425b-b207-9e707b21613d b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "c0b3d97e-60fb-487c-90d3-2b48392ff09f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.503s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:56:21 compute-0 nova_compute[192716]: 2025-10-07 21:56:21.297 2 DEBUG oslo_concurrency.lockutils [None req-3bbbc56e-5146-4322-969e-422481f2332c b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Acquiring lock "3407f49b-7e6b-4ff7-8ade-5caf647a9bd4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:56:21 compute-0 nova_compute[192716]: 2025-10-07 21:56:21.298 2 DEBUG oslo_concurrency.lockutils [None req-3bbbc56e-5146-4322-969e-422481f2332c b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "3407f49b-7e6b-4ff7-8ade-5caf647a9bd4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:56:21 compute-0 nova_compute[192716]: 2025-10-07 21:56:21.299 2 DEBUG oslo_concurrency.lockutils [None req-3bbbc56e-5146-4322-969e-422481f2332c b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Acquiring lock "3407f49b-7e6b-4ff7-8ade-5caf647a9bd4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:56:21 compute-0 nova_compute[192716]: 2025-10-07 21:56:21.299 2 DEBUG oslo_concurrency.lockutils [None req-3bbbc56e-5146-4322-969e-422481f2332c b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "3407f49b-7e6b-4ff7-8ade-5caf647a9bd4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:56:21 compute-0 nova_compute[192716]: 2025-10-07 21:56:21.299 2 DEBUG oslo_concurrency.lockutils [None req-3bbbc56e-5146-4322-969e-422481f2332c b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "3407f49b-7e6b-4ff7-8ade-5caf647a9bd4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:56:21 compute-0 nova_compute[192716]: 2025-10-07 21:56:21.313 2 INFO nova.compute.manager [None req-3bbbc56e-5146-4322-969e-422481f2332c b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 3407f49b-7e6b-4ff7-8ade-5caf647a9bd4] Terminating instance
Oct 07 21:56:21 compute-0 nova_compute[192716]: 2025-10-07 21:56:21.836 2 DEBUG nova.compute.manager [None req-3bbbc56e-5146-4322-969e-422481f2332c b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 3407f49b-7e6b-4ff7-8ade-5caf647a9bd4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 07 21:56:21 compute-0 kernel: tapa229ff5f-bd (unregistering): left promiscuous mode
Oct 07 21:56:21 compute-0 NetworkManager[51722]: <info>  [1759874181.8665] device (tapa229ff5f-bd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 07 21:56:21 compute-0 nova_compute[192716]: 2025-10-07 21:56:21.869 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:21 compute-0 nova_compute[192716]: 2025-10-07 21:56:21.871 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:21 compute-0 ovn_controller[94904]: 2025-10-07T21:56:21Z|00081|binding|INFO|Releasing lport a229ff5f-bd97-4beb-90ef-746057f7bbee from this chassis (sb_readonly=0)
Oct 07 21:56:21 compute-0 ovn_controller[94904]: 2025-10-07T21:56:21Z|00082|binding|INFO|Setting lport a229ff5f-bd97-4beb-90ef-746057f7bbee down in Southbound
Oct 07 21:56:21 compute-0 ovn_controller[94904]: 2025-10-07T21:56:21Z|00083|binding|INFO|Removing iface tapa229ff5f-bd ovn-installed in OVS
Oct 07 21:56:21 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:21.880 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:28:d9:17 10.100.0.3'], port_security=['fa:16:3e:28:d9:17 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '3407f49b-7e6b-4ff7-8ade-5caf647a9bd4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f0bd9c95-1d58-40c0-8d62-097453d85d3e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '42e6cb8a77b54158b2345b916b6fd79b', 'neutron:revision_number': '15', 'neutron:security_group_ids': '0b409cfc-ce5d-4372-a7fd-bd2f8e7211c2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=badb36bd-51e1-4b06-9dec-6b9bc7164000, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], logical_port=a229ff5f-bd97-4beb-90ef-746057f7bbee) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f071072e510>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 21:56:21 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:21.881 103791 INFO neutron.agent.ovn.metadata.agent [-] Port a229ff5f-bd97-4beb-90ef-746057f7bbee in datapath f0bd9c95-1d58-40c0-8d62-097453d85d3e unbound from our chassis
Oct 07 21:56:21 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:21.881 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f0bd9c95-1d58-40c0-8d62-097453d85d3e
Oct 07 21:56:21 compute-0 nova_compute[192716]: 2025-10-07 21:56:21.891 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:21 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:21.904 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[8fcef10a-e07e-41c6-9256-3bfc673fdd43]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:56:21 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000006.scope: Deactivated successfully.
Oct 07 21:56:21 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000006.scope: Consumed 4.362s CPU time.
Oct 07 21:56:21 compute-0 systemd-machined[152719]: Machine qemu-5-instance-00000006 terminated.
Oct 07 21:56:21 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:21.941 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[b1ecfd78-5fe3-4134-8775-c162c24c5286]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:56:21 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:21.943 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[e159ed06-689b-4d60-a34f-c102378fcc39]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:56:21 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:21.971 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[93f8db0c-9f7d-40a0-9552-f74ab33bd8a0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:56:21 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:21.989 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[d84de87e-0440-4d3c-baa4-fab6e7094ea3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf0bd9c95-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:94:9a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 36, 'tx_packets': 19, 'rx_bytes': 2008, 'tx_bytes': 942, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 36, 'tx_packets': 19, 'rx_bytes': 2008, 'tx_bytes': 942, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 369882, 'reachable_time': 24245, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218011, 'error': None, 'target': 'ovnmeta-f0bd9c95-1d58-40c0-8d62-097453d85d3e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:56:22 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:22.007 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[b9e5ab26-e4b7-40e1-9674-20471f3230eb]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf0bd9c95-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 369895, 'tstamp': 369895}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218012, 'error': None, 'target': 'ovnmeta-f0bd9c95-1d58-40c0-8d62-097453d85d3e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf0bd9c95-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 369898, 'tstamp': 369898}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218012, 'error': None, 'target': 'ovnmeta-f0bd9c95-1d58-40c0-8d62-097453d85d3e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:56:22 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:22.009 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf0bd9c95-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:56:22 compute-0 nova_compute[192716]: 2025-10-07 21:56:22.010 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:22 compute-0 nova_compute[192716]: 2025-10-07 21:56:22.015 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:22 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:22.016 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf0bd9c95-10, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:56:22 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:22.016 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 21:56:22 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:22.016 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf0bd9c95-10, col_values=(('external_ids', {'iface-id': 'c0a40c81-05dd-4977-aaa2-2a56498aa3a2'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:56:22 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:22.017 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 21:56:22 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:22.018 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[a707d072-76cb-41c9-9fdb-8ebd7cb8a43c]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-f0bd9c95-1d58-40c0-8d62-097453d85d3e\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/f0bd9c95-1d58-40c0-8d62-097453d85d3e.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID f0bd9c95-1d58-40c0-8d62-097453d85d3e\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:56:22 compute-0 nova_compute[192716]: 2025-10-07 21:56:22.107 2 INFO nova.virt.libvirt.driver [-] [instance: 3407f49b-7e6b-4ff7-8ade-5caf647a9bd4] Instance destroyed successfully.
Oct 07 21:56:22 compute-0 nova_compute[192716]: 2025-10-07 21:56:22.108 2 DEBUG nova.objects.instance [None req-3bbbc56e-5146-4322-969e-422481f2332c b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lazy-loading 'resources' on Instance uuid 3407f49b-7e6b-4ff7-8ade-5caf647a9bd4 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 21:56:22 compute-0 nova_compute[192716]: 2025-10-07 21:56:22.495 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:56:22 compute-0 nova_compute[192716]: 2025-10-07 21:56:22.496 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:56:22 compute-0 nova_compute[192716]: 2025-10-07 21:56:22.496 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 07 21:56:22 compute-0 nova_compute[192716]: 2025-10-07 21:56:22.619 2 DEBUG nova.virt.libvirt.vif [None req-3bbbc56e-5146-4322-969e-422481f2332c b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-10-07T21:53:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1675061367',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1675061367',id=6,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-07T21:53:46Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='42e6cb8a77b54158b2345b916b6fd79b',ramdisk_id='',reservation_id='r-irf06lw5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',clean_attempts='1',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1409880739',owner_user_name='tempest-TestExecuteActionsViaActuator-1409880739-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T21:55:48Z,user_data=None,user_id='b71b837a81994b9694ede764e0406ac8',uuid=3407f49b-7e6b-4ff7-8ade-5caf647a9bd4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a229ff5f-bd97-4beb-90ef-746057f7bbee", "address": "fa:16:3e:28:d9:17", "network": {"id": "f0bd9c95-1d58-40c0-8d62-097453d85d3e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-309070824-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97ef6e0949aa4dd8b3ac7e1495d532e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa229ff5f-bd", "ovs_interfaceid": "a229ff5f-bd97-4beb-90ef-746057f7bbee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 07 21:56:22 compute-0 nova_compute[192716]: 2025-10-07 21:56:22.620 2 DEBUG nova.network.os_vif_util [None req-3bbbc56e-5146-4322-969e-422481f2332c b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Converting VIF {"id": "a229ff5f-bd97-4beb-90ef-746057f7bbee", "address": "fa:16:3e:28:d9:17", "network": {"id": "f0bd9c95-1d58-40c0-8d62-097453d85d3e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-309070824-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97ef6e0949aa4dd8b3ac7e1495d532e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa229ff5f-bd", "ovs_interfaceid": "a229ff5f-bd97-4beb-90ef-746057f7bbee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 21:56:22 compute-0 nova_compute[192716]: 2025-10-07 21:56:22.621 2 DEBUG nova.network.os_vif_util [None req-3bbbc56e-5146-4322-969e-422481f2332c b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:28:d9:17,bridge_name='br-int',has_traffic_filtering=True,id=a229ff5f-bd97-4beb-90ef-746057f7bbee,network=Network(f0bd9c95-1d58-40c0-8d62-097453d85d3e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa229ff5f-bd') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 21:56:22 compute-0 nova_compute[192716]: 2025-10-07 21:56:22.621 2 DEBUG os_vif [None req-3bbbc56e-5146-4322-969e-422481f2332c b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:28:d9:17,bridge_name='br-int',has_traffic_filtering=True,id=a229ff5f-bd97-4beb-90ef-746057f7bbee,network=Network(f0bd9c95-1d58-40c0-8d62-097453d85d3e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa229ff5f-bd') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 07 21:56:22 compute-0 nova_compute[192716]: 2025-10-07 21:56:22.622 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:22 compute-0 nova_compute[192716]: 2025-10-07 21:56:22.622 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa229ff5f-bd, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:56:22 compute-0 nova_compute[192716]: 2025-10-07 21:56:22.624 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:22 compute-0 nova_compute[192716]: 2025-10-07 21:56:22.626 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:22 compute-0 nova_compute[192716]: 2025-10-07 21:56:22.627 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:22 compute-0 nova_compute[192716]: 2025-10-07 21:56:22.627 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=db9f03c6-a578-4457-abbf-c54a3865eb29) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:56:22 compute-0 nova_compute[192716]: 2025-10-07 21:56:22.628 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:22 compute-0 nova_compute[192716]: 2025-10-07 21:56:22.629 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 07 21:56:22 compute-0 nova_compute[192716]: 2025-10-07 21:56:22.631 2 INFO os_vif [None req-3bbbc56e-5146-4322-969e-422481f2332c b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:28:d9:17,bridge_name='br-int',has_traffic_filtering=True,id=a229ff5f-bd97-4beb-90ef-746057f7bbee,network=Network(f0bd9c95-1d58-40c0-8d62-097453d85d3e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa229ff5f-bd')
Oct 07 21:56:22 compute-0 nova_compute[192716]: 2025-10-07 21:56:22.632 2 INFO nova.virt.libvirt.driver [None req-3bbbc56e-5146-4322-969e-422481f2332c b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 3407f49b-7e6b-4ff7-8ade-5caf647a9bd4] Deleting instance files /var/lib/nova/instances/3407f49b-7e6b-4ff7-8ade-5caf647a9bd4_del
Oct 07 21:56:22 compute-0 nova_compute[192716]: 2025-10-07 21:56:22.632 2 INFO nova.virt.libvirt.driver [None req-3bbbc56e-5146-4322-969e-422481f2332c b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 3407f49b-7e6b-4ff7-8ade-5caf647a9bd4] Deletion of /var/lib/nova/instances/3407f49b-7e6b-4ff7-8ade-5caf647a9bd4_del complete
Oct 07 21:56:22 compute-0 nova_compute[192716]: 2025-10-07 21:56:22.962 2 DEBUG nova.compute.manager [req-8cab8c06-a355-4fd9-9985-ec30c571a1e0 req-c0eb0e2b-0c1e-4f12-aefb-e6132eccf01d 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 3407f49b-7e6b-4ff7-8ade-5caf647a9bd4] Received event network-vif-unplugged-a229ff5f-bd97-4beb-90ef-746057f7bbee external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 21:56:22 compute-0 nova_compute[192716]: 2025-10-07 21:56:22.962 2 DEBUG oslo_concurrency.lockutils [req-8cab8c06-a355-4fd9-9985-ec30c571a1e0 req-c0eb0e2b-0c1e-4f12-aefb-e6132eccf01d 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "3407f49b-7e6b-4ff7-8ade-5caf647a9bd4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:56:22 compute-0 nova_compute[192716]: 2025-10-07 21:56:22.962 2 DEBUG oslo_concurrency.lockutils [req-8cab8c06-a355-4fd9-9985-ec30c571a1e0 req-c0eb0e2b-0c1e-4f12-aefb-e6132eccf01d 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "3407f49b-7e6b-4ff7-8ade-5caf647a9bd4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:56:22 compute-0 nova_compute[192716]: 2025-10-07 21:56:22.963 2 DEBUG oslo_concurrency.lockutils [req-8cab8c06-a355-4fd9-9985-ec30c571a1e0 req-c0eb0e2b-0c1e-4f12-aefb-e6132eccf01d 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "3407f49b-7e6b-4ff7-8ade-5caf647a9bd4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:56:22 compute-0 nova_compute[192716]: 2025-10-07 21:56:22.963 2 DEBUG nova.compute.manager [req-8cab8c06-a355-4fd9-9985-ec30c571a1e0 req-c0eb0e2b-0c1e-4f12-aefb-e6132eccf01d 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 3407f49b-7e6b-4ff7-8ade-5caf647a9bd4] No waiting events found dispatching network-vif-unplugged-a229ff5f-bd97-4beb-90ef-746057f7bbee pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 21:56:22 compute-0 nova_compute[192716]: 2025-10-07 21:56:22.964 2 DEBUG nova.compute.manager [req-8cab8c06-a355-4fd9-9985-ec30c571a1e0 req-c0eb0e2b-0c1e-4f12-aefb-e6132eccf01d 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 3407f49b-7e6b-4ff7-8ade-5caf647a9bd4] Received event network-vif-unplugged-a229ff5f-bd97-4beb-90ef-746057f7bbee for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 07 21:56:22 compute-0 nova_compute[192716]: 2025-10-07 21:56:22.991 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:56:23 compute-0 nova_compute[192716]: 2025-10-07 21:56:23.144 2 INFO nova.compute.manager [None req-3bbbc56e-5146-4322-969e-422481f2332c b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 3407f49b-7e6b-4ff7-8ade-5caf647a9bd4] Took 1.31 seconds to destroy the instance on the hypervisor.
Oct 07 21:56:23 compute-0 nova_compute[192716]: 2025-10-07 21:56:23.145 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-3bbbc56e-5146-4322-969e-422481f2332c b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 07 21:56:23 compute-0 nova_compute[192716]: 2025-10-07 21:56:23.145 2 DEBUG nova.compute.manager [-] [instance: 3407f49b-7e6b-4ff7-8ade-5caf647a9bd4] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 07 21:56:23 compute-0 nova_compute[192716]: 2025-10-07 21:56:23.146 2 DEBUG nova.network.neutron [-] [instance: 3407f49b-7e6b-4ff7-8ade-5caf647a9bd4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 07 21:56:23 compute-0 nova_compute[192716]: 2025-10-07 21:56:23.146 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:56:23 compute-0 nova_compute[192716]: 2025-10-07 21:56:23.804 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:56:23 compute-0 nova_compute[192716]: 2025-10-07 21:56:23.985 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:56:24 compute-0 nova_compute[192716]: 2025-10-07 21:56:24.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:24 compute-0 podman[218031]: 2025-10-07 21:56:24.887985287 +0000 UTC m=+0.113246040 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251007, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct 07 21:56:24 compute-0 nova_compute[192716]: 2025-10-07 21:56:24.918 2 DEBUG nova.compute.manager [req-775d0daf-8b17-4a54-8f14-209c92106bc4 req-97770b73-d6ba-4df0-8c68-dd9b5c80c667 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 3407f49b-7e6b-4ff7-8ade-5caf647a9bd4] Received event network-vif-deleted-a229ff5f-bd97-4beb-90ef-746057f7bbee external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 21:56:24 compute-0 nova_compute[192716]: 2025-10-07 21:56:24.918 2 INFO nova.compute.manager [req-775d0daf-8b17-4a54-8f14-209c92106bc4 req-97770b73-d6ba-4df0-8c68-dd9b5c80c667 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 3407f49b-7e6b-4ff7-8ade-5caf647a9bd4] Neutron deleted interface a229ff5f-bd97-4beb-90ef-746057f7bbee; detaching it from the instance and deleting it from the info cache
Oct 07 21:56:24 compute-0 nova_compute[192716]: 2025-10-07 21:56:24.918 2 DEBUG nova.network.neutron [req-775d0daf-8b17-4a54-8f14-209c92106bc4 req-97770b73-d6ba-4df0-8c68-dd9b5c80c667 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 3407f49b-7e6b-4ff7-8ade-5caf647a9bd4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 21:56:24 compute-0 nova_compute[192716]: 2025-10-07 21:56:24.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:56:24 compute-0 nova_compute[192716]: 2025-10-07 21:56:24.990 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Oct 07 21:56:25 compute-0 nova_compute[192716]: 2025-10-07 21:56:25.015 2 DEBUG nova.compute.manager [req-8d9e86f5-06a6-42f3-b63b-e7e8f956b460 req-3a0204ab-3915-4df9-8bc8-fab776c7fbb5 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 3407f49b-7e6b-4ff7-8ade-5caf647a9bd4] Received event network-vif-unplugged-a229ff5f-bd97-4beb-90ef-746057f7bbee external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 21:56:25 compute-0 nova_compute[192716]: 2025-10-07 21:56:25.016 2 DEBUG oslo_concurrency.lockutils [req-8d9e86f5-06a6-42f3-b63b-e7e8f956b460 req-3a0204ab-3915-4df9-8bc8-fab776c7fbb5 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "3407f49b-7e6b-4ff7-8ade-5caf647a9bd4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:56:25 compute-0 nova_compute[192716]: 2025-10-07 21:56:25.016 2 DEBUG oslo_concurrency.lockutils [req-8d9e86f5-06a6-42f3-b63b-e7e8f956b460 req-3a0204ab-3915-4df9-8bc8-fab776c7fbb5 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "3407f49b-7e6b-4ff7-8ade-5caf647a9bd4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:56:25 compute-0 nova_compute[192716]: 2025-10-07 21:56:25.016 2 DEBUG oslo_concurrency.lockutils [req-8d9e86f5-06a6-42f3-b63b-e7e8f956b460 req-3a0204ab-3915-4df9-8bc8-fab776c7fbb5 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "3407f49b-7e6b-4ff7-8ade-5caf647a9bd4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:56:25 compute-0 nova_compute[192716]: 2025-10-07 21:56:25.017 2 DEBUG nova.compute.manager [req-8d9e86f5-06a6-42f3-b63b-e7e8f956b460 req-3a0204ab-3915-4df9-8bc8-fab776c7fbb5 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 3407f49b-7e6b-4ff7-8ade-5caf647a9bd4] No waiting events found dispatching network-vif-unplugged-a229ff5f-bd97-4beb-90ef-746057f7bbee pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 21:56:25 compute-0 nova_compute[192716]: 2025-10-07 21:56:25.017 2 DEBUG nova.compute.manager [req-8d9e86f5-06a6-42f3-b63b-e7e8f956b460 req-3a0204ab-3915-4df9-8bc8-fab776c7fbb5 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 3407f49b-7e6b-4ff7-8ade-5caf647a9bd4] Received event network-vif-unplugged-a229ff5f-bd97-4beb-90ef-746057f7bbee for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 07 21:56:25 compute-0 nova_compute[192716]: 2025-10-07 21:56:25.376 2 DEBUG nova.network.neutron [-] [instance: 3407f49b-7e6b-4ff7-8ade-5caf647a9bd4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 21:56:25 compute-0 nova_compute[192716]: 2025-10-07 21:56:25.424 2 DEBUG nova.compute.manager [req-775d0daf-8b17-4a54-8f14-209c92106bc4 req-97770b73-d6ba-4df0-8c68-dd9b5c80c667 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 3407f49b-7e6b-4ff7-8ade-5caf647a9bd4] Detach interface failed, port_id=a229ff5f-bd97-4beb-90ef-746057f7bbee, reason: Instance 3407f49b-7e6b-4ff7-8ade-5caf647a9bd4 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 07 21:56:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:25.615 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:56:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:25.616 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:56:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:25.616 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:56:25 compute-0 nova_compute[192716]: 2025-10-07 21:56:25.883 2 INFO nova.compute.manager [-] [instance: 3407f49b-7e6b-4ff7-8ade-5caf647a9bd4] Took 2.74 seconds to deallocate network for instance.
Oct 07 21:56:26 compute-0 nova_compute[192716]: 2025-10-07 21:56:26.404 2 DEBUG oslo_concurrency.lockutils [None req-3bbbc56e-5146-4322-969e-422481f2332c b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:56:26 compute-0 nova_compute[192716]: 2025-10-07 21:56:26.405 2 DEBUG oslo_concurrency.lockutils [None req-3bbbc56e-5146-4322-969e-422481f2332c b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:56:26 compute-0 nova_compute[192716]: 2025-10-07 21:56:26.412 2 DEBUG oslo_concurrency.lockutils [None req-3bbbc56e-5146-4322-969e-422481f2332c b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.007s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:56:26 compute-0 nova_compute[192716]: 2025-10-07 21:56:26.444 2 INFO nova.scheduler.client.report [None req-3bbbc56e-5146-4322-969e-422481f2332c b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Deleted allocations for instance 3407f49b-7e6b-4ff7-8ade-5caf647a9bd4
Oct 07 21:56:26 compute-0 nova_compute[192716]: 2025-10-07 21:56:26.495 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:56:26 compute-0 nova_compute[192716]: 2025-10-07 21:56:26.496 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Oct 07 21:56:26 compute-0 podman[218058]: 2025-10-07 21:56:26.858689144 +0000 UTC m=+0.093358442 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251007, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct 07 21:56:27 compute-0 nova_compute[192716]: 2025-10-07 21:56:27.001 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Oct 07 21:56:27 compute-0 nova_compute[192716]: 2025-10-07 21:56:27.483 2 DEBUG oslo_concurrency.lockutils [None req-3bbbc56e-5146-4322-969e-422481f2332c b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "3407f49b-7e6b-4ff7-8ade-5caf647a9bd4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.185s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:56:27 compute-0 nova_compute[192716]: 2025-10-07 21:56:27.497 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:56:27 compute-0 nova_compute[192716]: 2025-10-07 21:56:27.629 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:27 compute-0 nova_compute[192716]: 2025-10-07 21:56:27.989 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:56:27 compute-0 nova_compute[192716]: 2025-10-07 21:56:27.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:56:28 compute-0 nova_compute[192716]: 2025-10-07 21:56:28.502 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:56:28 compute-0 nova_compute[192716]: 2025-10-07 21:56:28.502 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:56:28 compute-0 nova_compute[192716]: 2025-10-07 21:56:28.503 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:56:28 compute-0 nova_compute[192716]: 2025-10-07 21:56:28.503 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 07 21:56:29 compute-0 nova_compute[192716]: 2025-10-07 21:56:29.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:29 compute-0 nova_compute[192716]: 2025-10-07 21:56:29.555 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5138bd92-9a6e-4088-b0b2-bee3a14683ac/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:56:29 compute-0 nova_compute[192716]: 2025-10-07 21:56:29.614 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5138bd92-9a6e-4088-b0b2-bee3a14683ac/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:56:29 compute-0 nova_compute[192716]: 2025-10-07 21:56:29.615 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5138bd92-9a6e-4088-b0b2-bee3a14683ac/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:56:29 compute-0 nova_compute[192716]: 2025-10-07 21:56:29.668 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5138bd92-9a6e-4088-b0b2-bee3a14683ac/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:56:29 compute-0 podman[203153]: time="2025-10-07T21:56:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 21:56:29 compute-0 podman[203153]: @ - - [07/Oct/2025:21:56:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20751 "" "Go-http-client/1.1"
Oct 07 21:56:29 compute-0 podman[203153]: @ - - [07/Oct/2025:21:56:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3475 "" "Go-http-client/1.1"
Oct 07 21:56:29 compute-0 nova_compute[192716]: 2025-10-07 21:56:29.809 2 WARNING nova.virt.libvirt.driver [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 21:56:29 compute-0 nova_compute[192716]: 2025-10-07 21:56:29.810 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:56:29 compute-0 nova_compute[192716]: 2025-10-07 21:56:29.830 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.020s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:56:29 compute-0 nova_compute[192716]: 2025-10-07 21:56:29.831 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5643MB free_disk=73.27760314941406GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 07 21:56:29 compute-0 nova_compute[192716]: 2025-10-07 21:56:29.831 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:56:29 compute-0 nova_compute[192716]: 2025-10-07 21:56:29.832 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:56:30 compute-0 nova_compute[192716]: 2025-10-07 21:56:30.466 2 DEBUG oslo_concurrency.lockutils [None req-3cdfa1c6-3bae-49c0-bffb-c3ab222bd1ab b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Acquiring lock "5138bd92-9a6e-4088-b0b2-bee3a14683ac" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:56:30 compute-0 nova_compute[192716]: 2025-10-07 21:56:30.467 2 DEBUG oslo_concurrency.lockutils [None req-3cdfa1c6-3bae-49c0-bffb-c3ab222bd1ab b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "5138bd92-9a6e-4088-b0b2-bee3a14683ac" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:56:30 compute-0 nova_compute[192716]: 2025-10-07 21:56:30.467 2 DEBUG oslo_concurrency.lockutils [None req-3cdfa1c6-3bae-49c0-bffb-c3ab222bd1ab b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Acquiring lock "5138bd92-9a6e-4088-b0b2-bee3a14683ac-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:56:30 compute-0 nova_compute[192716]: 2025-10-07 21:56:30.467 2 DEBUG oslo_concurrency.lockutils [None req-3cdfa1c6-3bae-49c0-bffb-c3ab222bd1ab b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "5138bd92-9a6e-4088-b0b2-bee3a14683ac-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:56:30 compute-0 nova_compute[192716]: 2025-10-07 21:56:30.468 2 DEBUG oslo_concurrency.lockutils [None req-3cdfa1c6-3bae-49c0-bffb-c3ab222bd1ab b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "5138bd92-9a6e-4088-b0b2-bee3a14683ac-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:56:30 compute-0 nova_compute[192716]: 2025-10-07 21:56:30.480 2 INFO nova.compute.manager [None req-3cdfa1c6-3bae-49c0-bffb-c3ab222bd1ab b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Terminating instance
Oct 07 21:56:30 compute-0 podman[218085]: 2025-10-07 21:56:30.842010524 +0000 UTC m=+0.076015526 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.7, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, config_id=edpm, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Oct 07 21:56:30 compute-0 nova_compute[192716]: 2025-10-07 21:56:30.869 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Instance 5138bd92-9a6e-4088-b0b2-bee3a14683ac actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 07 21:56:30 compute-0 nova_compute[192716]: 2025-10-07 21:56:30.869 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 07 21:56:30 compute-0 nova_compute[192716]: 2025-10-07 21:56:30.870 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:56:29 up  1:05,  0 user,  load average: 0.89, 0.44, 0.42\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_42e6cb8a77b54158b2345b916b6fd79b': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 07 21:56:30 compute-0 nova_compute[192716]: 2025-10-07 21:56:30.909 2 DEBUG nova.compute.provider_tree [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 21:56:30 compute-0 nova_compute[192716]: 2025-10-07 21:56:30.996 2 DEBUG nova.compute.manager [None req-3cdfa1c6-3bae-49c0-bffb-c3ab222bd1ab b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 07 21:56:31 compute-0 kernel: tap0f47f8fd-d8 (unregistering): left promiscuous mode
Oct 07 21:56:31 compute-0 NetworkManager[51722]: <info>  [1759874191.0351] device (tap0f47f8fd-d8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 07 21:56:31 compute-0 ovn_controller[94904]: 2025-10-07T21:56:31Z|00084|binding|INFO|Releasing lport 0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce from this chassis (sb_readonly=0)
Oct 07 21:56:31 compute-0 ovn_controller[94904]: 2025-10-07T21:56:31Z|00085|binding|INFO|Setting lport 0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce down in Southbound
Oct 07 21:56:31 compute-0 nova_compute[192716]: 2025-10-07 21:56:31.068 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:31 compute-0 ovn_controller[94904]: 2025-10-07T21:56:31Z|00086|binding|INFO|Removing iface tap0f47f8fd-d8 ovn-installed in OVS
Oct 07 21:56:31 compute-0 nova_compute[192716]: 2025-10-07 21:56:31.073 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:31 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:31.081 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fb:65:92 10.100.0.9'], port_security=['fa:16:3e:fb:65:92 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '5138bd92-9a6e-4088-b0b2-bee3a14683ac', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f0bd9c95-1d58-40c0-8d62-097453d85d3e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '42e6cb8a77b54158b2345b916b6fd79b', 'neutron:revision_number': '5', 'neutron:security_group_ids': '0b409cfc-ce5d-4372-a7fd-bd2f8e7211c2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=badb36bd-51e1-4b06-9dec-6b9bc7164000, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], logical_port=0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f071072e510>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 21:56:31 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:31.082 103791 INFO neutron.agent.ovn.metadata.agent [-] Port 0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce in datapath f0bd9c95-1d58-40c0-8d62-097453d85d3e unbound from our chassis
Oct 07 21:56:31 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:31.083 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f0bd9c95-1d58-40c0-8d62-097453d85d3e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 07 21:56:31 compute-0 nova_compute[192716]: 2025-10-07 21:56:31.084 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:31 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:31.084 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[4d7f9951-e754-4461-85bd-45eff8a728e6]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:56:31 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:31.085 103791 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f0bd9c95-1d58-40c0-8d62-097453d85d3e namespace which is not needed anymore
Oct 07 21:56:31 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000005.scope: Deactivated successfully.
Oct 07 21:56:31 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000005.scope: Consumed 22.033s CPU time.
Oct 07 21:56:31 compute-0 systemd-machined[152719]: Machine qemu-2-instance-00000005 terminated.
Oct 07 21:56:31 compute-0 kernel: tap0f47f8fd-d8: entered promiscuous mode
Oct 07 21:56:31 compute-0 NetworkManager[51722]: <info>  [1759874191.2187] manager: (tap0f47f8fd-d8): new Tun device (/org/freedesktop/NetworkManager/Devices/38)
Oct 07 21:56:31 compute-0 nova_compute[192716]: 2025-10-07 21:56:31.219 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:31 compute-0 kernel: tap0f47f8fd-d8 (unregistering): left promiscuous mode
Oct 07 21:56:31 compute-0 ovn_controller[94904]: 2025-10-07T21:56:31Z|00087|binding|INFO|Claiming lport 0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce for this chassis.
Oct 07 21:56:31 compute-0 ovn_controller[94904]: 2025-10-07T21:56:31Z|00088|binding|INFO|0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce: Claiming fa:16:3e:fb:65:92 10.100.0.9
Oct 07 21:56:31 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:31.234 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fb:65:92 10.100.0.9'], port_security=['fa:16:3e:fb:65:92 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '5138bd92-9a6e-4088-b0b2-bee3a14683ac', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f0bd9c95-1d58-40c0-8d62-097453d85d3e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '42e6cb8a77b54158b2345b916b6fd79b', 'neutron:revision_number': '5', 'neutron:security_group_ids': '0b409cfc-ce5d-4372-a7fd-bd2f8e7211c2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=badb36bd-51e1-4b06-9dec-6b9bc7164000, chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], logical_port=0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 21:56:31 compute-0 ovn_controller[94904]: 2025-10-07T21:56:31Z|00089|binding|INFO|Setting lport 0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce ovn-installed in OVS
Oct 07 21:56:31 compute-0 ovn_controller[94904]: 2025-10-07T21:56:31Z|00090|binding|INFO|Setting lport 0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce up in Southbound
Oct 07 21:56:31 compute-0 ovn_controller[94904]: 2025-10-07T21:56:31Z|00091|binding|INFO|Releasing lport 0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce from this chassis (sb_readonly=1)
Oct 07 21:56:31 compute-0 ovn_controller[94904]: 2025-10-07T21:56:31Z|00092|if_status|INFO|Not setting lport 0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce down as sb is readonly
Oct 07 21:56:31 compute-0 ovn_controller[94904]: 2025-10-07T21:56:31Z|00093|binding|INFO|Removing iface tap0f47f8fd-d8 ovn-installed in OVS
Oct 07 21:56:31 compute-0 nova_compute[192716]: 2025-10-07 21:56:31.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:31 compute-0 ovn_controller[94904]: 2025-10-07T21:56:31Z|00094|binding|INFO|Releasing lport 0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce from this chassis (sb_readonly=0)
Oct 07 21:56:31 compute-0 ovn_controller[94904]: 2025-10-07T21:56:31Z|00095|binding|INFO|Setting lport 0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce down in Southbound
Oct 07 21:56:31 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:31.256 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fb:65:92 10.100.0.9'], port_security=['fa:16:3e:fb:65:92 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '5138bd92-9a6e-4088-b0b2-bee3a14683ac', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f0bd9c95-1d58-40c0-8d62-097453d85d3e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '42e6cb8a77b54158b2345b916b6fd79b', 'neutron:revision_number': '5', 'neutron:security_group_ids': '0b409cfc-ce5d-4372-a7fd-bd2f8e7211c2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=badb36bd-51e1-4b06-9dec-6b9bc7164000, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], logical_port=0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f071072e510>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 21:56:31 compute-0 nova_compute[192716]: 2025-10-07 21:56:31.261 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:31 compute-0 neutron-haproxy-ovnmeta-f0bd9c95-1d58-40c0-8d62-097453d85d3e[216546]: [NOTICE]   (216550) : haproxy version is 3.0.5-8e879a5
Oct 07 21:56:31 compute-0 neutron-haproxy-ovnmeta-f0bd9c95-1d58-40c0-8d62-097453d85d3e[216546]: [NOTICE]   (216550) : path to executable is /usr/sbin/haproxy
Oct 07 21:56:31 compute-0 neutron-haproxy-ovnmeta-f0bd9c95-1d58-40c0-8d62-097453d85d3e[216546]: [WARNING]  (216550) : Exiting Master process...
Oct 07 21:56:31 compute-0 podman[218132]: 2025-10-07 21:56:31.266025021 +0000 UTC m=+0.056855787 container kill e780af2d5ae58d6f3b8b43c56e407e8240a0f5ef6fc010fc71d82ffcdd3d24b6 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f0bd9c95-1d58-40c0-8d62-097453d85d3e, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 07 21:56:31 compute-0 neutron-haproxy-ovnmeta-f0bd9c95-1d58-40c0-8d62-097453d85d3e[216546]: [ALERT]    (216550) : Current worker (216552) exited with code 143 (Terminated)
Oct 07 21:56:31 compute-0 neutron-haproxy-ovnmeta-f0bd9c95-1d58-40c0-8d62-097453d85d3e[216546]: [WARNING]  (216550) : All workers exited. Exiting... (0)
Oct 07 21:56:31 compute-0 systemd[1]: libpod-e780af2d5ae58d6f3b8b43c56e407e8240a0f5ef6fc010fc71d82ffcdd3d24b6.scope: Deactivated successfully.
Oct 07 21:56:31 compute-0 nova_compute[192716]: 2025-10-07 21:56:31.286 2 INFO nova.virt.libvirt.driver [-] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Instance destroyed successfully.
Oct 07 21:56:31 compute-0 nova_compute[192716]: 2025-10-07 21:56:31.287 2 DEBUG nova.objects.instance [None req-3cdfa1c6-3bae-49c0-bffb-c3ab222bd1ab b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lazy-loading 'resources' on Instance uuid 5138bd92-9a6e-4088-b0b2-bee3a14683ac obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 21:56:31 compute-0 podman[218154]: 2025-10-07 21:56:31.310625697 +0000 UTC m=+0.028234059 container died e780af2d5ae58d6f3b8b43c56e407e8240a0f5ef6fc010fc71d82ffcdd3d24b6 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f0bd9c95-1d58-40c0-8d62-097453d85d3e, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0)
Oct 07 21:56:31 compute-0 nova_compute[192716]: 2025-10-07 21:56:31.417 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 21:56:31 compute-0 openstack_network_exporter[205305]: ERROR   21:56:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 21:56:31 compute-0 openstack_network_exporter[205305]: ERROR   21:56:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:56:31 compute-0 openstack_network_exporter[205305]: ERROR   21:56:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:56:31 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e780af2d5ae58d6f3b8b43c56e407e8240a0f5ef6fc010fc71d82ffcdd3d24b6-userdata-shm.mount: Deactivated successfully.
Oct 07 21:56:31 compute-0 openstack_network_exporter[205305]: ERROR   21:56:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 21:56:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 21:56:31 compute-0 openstack_network_exporter[205305]: ERROR   21:56:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 21:56:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 21:56:31 compute-0 systemd[1]: var-lib-containers-storage-overlay-efa3994277d38f899a53625af013c8b4fd470a62ea7c1116dfc5debd5a91effa-merged.mount: Deactivated successfully.
Oct 07 21:56:31 compute-0 podman[218154]: 2025-10-07 21:56:31.443175738 +0000 UTC m=+0.160784120 container cleanup e780af2d5ae58d6f3b8b43c56e407e8240a0f5ef6fc010fc71d82ffcdd3d24b6 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f0bd9c95-1d58-40c0-8d62-097453d85d3e, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251007, tcib_managed=true)
Oct 07 21:56:31 compute-0 systemd[1]: libpod-conmon-e780af2d5ae58d6f3b8b43c56e407e8240a0f5ef6fc010fc71d82ffcdd3d24b6.scope: Deactivated successfully.
Oct 07 21:56:31 compute-0 podman[218169]: 2025-10-07 21:56:31.470596472 +0000 UTC m=+0.150887336 container remove e780af2d5ae58d6f3b8b43c56e407e8240a0f5ef6fc010fc71d82ffcdd3d24b6 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f0bd9c95-1d58-40c0-8d62-097453d85d3e, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 07 21:56:31 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:31.479 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[c1e9cb41-f837-4dd5-81f8-2ea19da679f2]: (4, ("Tue Oct  7 09:56:31 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-f0bd9c95-1d58-40c0-8d62-097453d85d3e (e780af2d5ae58d6f3b8b43c56e407e8240a0f5ef6fc010fc71d82ffcdd3d24b6)\ne780af2d5ae58d6f3b8b43c56e407e8240a0f5ef6fc010fc71d82ffcdd3d24b6\nTue Oct  7 09:56:31 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f0bd9c95-1d58-40c0-8d62-097453d85d3e (e780af2d5ae58d6f3b8b43c56e407e8240a0f5ef6fc010fc71d82ffcdd3d24b6)\ne780af2d5ae58d6f3b8b43c56e407e8240a0f5ef6fc010fc71d82ffcdd3d24b6\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:56:31 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:31.480 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[2fefb56b-20a9-4f53-aace-4946be9df9c8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:56:31 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:31.481 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f0bd9c95-1d58-40c0-8d62-097453d85d3e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f0bd9c95-1d58-40c0-8d62-097453d85d3e.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 21:56:31 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:31.482 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[ec2da9e8-3172-475a-9990-b6e8143a98ec]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:56:31 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:31.482 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf0bd9c95-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:56:31 compute-0 nova_compute[192716]: 2025-10-07 21:56:31.484 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:31 compute-0 kernel: tapf0bd9c95-10: left promiscuous mode
Oct 07 21:56:31 compute-0 nova_compute[192716]: 2025-10-07 21:56:31.499 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:31 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:31.502 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[18562730-e738-4f67-92d1-28907d3ed82a]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:56:31 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:31.525 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[6d720fe9-d832-41a7-826e-c33c818e81cd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:56:31 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:31.528 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[054c1628-bf12-44ee-baee-9eda86f5ca61]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:56:31 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:31.542 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[fff16347-2cd5-4927-9f07-7fa731bf72b6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 369873, 'reachable_time': 16408, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218188, 'error': None, 'target': 'ovnmeta-f0bd9c95-1d58-40c0-8d62-097453d85d3e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:56:31 compute-0 systemd[1]: run-netns-ovnmeta\x2df0bd9c95\x2d1d58\x2d40c0\x2d8d62\x2d097453d85d3e.mount: Deactivated successfully.
Oct 07 21:56:31 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:31.551 103905 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f0bd9c95-1d58-40c0-8d62-097453d85d3e deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Oct 07 21:56:31 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:31.551 103905 DEBUG oslo.privsep.daemon [-] privsep: reply[d575b3d1-de98-4d79-b4ca-15665a0239ac]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:56:31 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:31.552 103791 INFO neutron.agent.ovn.metadata.agent [-] Port 0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce in datapath f0bd9c95-1d58-40c0-8d62-097453d85d3e unbound from our chassis
Oct 07 21:56:31 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:31.554 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f0bd9c95-1d58-40c0-8d62-097453d85d3e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 07 21:56:31 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:31.555 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[bb84c0d5-0294-4021-9fc9-fb3ddf451d3a]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:56:31 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:31.555 103791 INFO neutron.agent.ovn.metadata.agent [-] Port 0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce in datapath f0bd9c95-1d58-40c0-8d62-097453d85d3e unbound from our chassis
Oct 07 21:56:31 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:31.557 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f0bd9c95-1d58-40c0-8d62-097453d85d3e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 07 21:56:31 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:31.557 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[3374a014-a604-4c61-b382-3d1bd9c3ba6d]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:56:31 compute-0 nova_compute[192716]: 2025-10-07 21:56:31.928 2 DEBUG nova.virt.libvirt.vif [None req-3cdfa1c6-3bae-49c0-bffb-c3ab222bd1ab b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-10-07T21:52:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1913099881',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1913099881',id=5,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-07T21:52:47Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='42e6cb8a77b54158b2345b916b6fd79b',ramdisk_id='',reservation_id='r-0x9q5j10',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1
',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1409880739',owner_user_name='tempest-TestExecuteActionsViaActuator-1409880739-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T21:52:47Z,user_data=None,user_id='b71b837a81994b9694ede764e0406ac8',uuid=5138bd92-9a6e-4088-b0b2-bee3a14683ac,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce", "address": "fa:16:3e:fb:65:92", "network": {"id": "f0bd9c95-1d58-40c0-8d62-097453d85d3e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-309070824-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97ef6e0949aa4dd8b3ac7e1495d532e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0f47f8fd-d8", "ovs_interfaceid": "0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 07 21:56:31 compute-0 nova_compute[192716]: 2025-10-07 21:56:31.928 2 DEBUG nova.network.os_vif_util [None req-3cdfa1c6-3bae-49c0-bffb-c3ab222bd1ab b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Converting VIF {"id": "0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce", "address": "fa:16:3e:fb:65:92", "network": {"id": "f0bd9c95-1d58-40c0-8d62-097453d85d3e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-309070824-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97ef6e0949aa4dd8b3ac7e1495d532e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0f47f8fd-d8", "ovs_interfaceid": "0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 21:56:31 compute-0 nova_compute[192716]: 2025-10-07 21:56:31.929 2 DEBUG nova.network.os_vif_util [None req-3cdfa1c6-3bae-49c0-bffb-c3ab222bd1ab b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fb:65:92,bridge_name='br-int',has_traffic_filtering=True,id=0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce,network=Network(f0bd9c95-1d58-40c0-8d62-097453d85d3e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0f47f8fd-d8') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 21:56:31 compute-0 nova_compute[192716]: 2025-10-07 21:56:31.930 2 DEBUG os_vif [None req-3cdfa1c6-3bae-49c0-bffb-c3ab222bd1ab b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fb:65:92,bridge_name='br-int',has_traffic_filtering=True,id=0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce,network=Network(f0bd9c95-1d58-40c0-8d62-097453d85d3e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0f47f8fd-d8') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 07 21:56:31 compute-0 nova_compute[192716]: 2025-10-07 21:56:31.934 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 07 21:56:31 compute-0 nova_compute[192716]: 2025-10-07 21:56:31.934 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.103s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:56:31 compute-0 nova_compute[192716]: 2025-10-07 21:56:31.935 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:31 compute-0 nova_compute[192716]: 2025-10-07 21:56:31.936 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0f47f8fd-d8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:56:31 compute-0 nova_compute[192716]: 2025-10-07 21:56:31.938 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:31 compute-0 nova_compute[192716]: 2025-10-07 21:56:31.940 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:31 compute-0 nova_compute[192716]: 2025-10-07 21:56:31.941 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:31 compute-0 nova_compute[192716]: 2025-10-07 21:56:31.941 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=0bd7538c-443f-46e3-a06a-8cc1ba8bded3) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:56:31 compute-0 nova_compute[192716]: 2025-10-07 21:56:31.942 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:31 compute-0 nova_compute[192716]: 2025-10-07 21:56:31.943 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:31 compute-0 nova_compute[192716]: 2025-10-07 21:56:31.945 2 INFO os_vif [None req-3cdfa1c6-3bae-49c0-bffb-c3ab222bd1ab b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fb:65:92,bridge_name='br-int',has_traffic_filtering=True,id=0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce,network=Network(f0bd9c95-1d58-40c0-8d62-097453d85d3e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0f47f8fd-d8')
Oct 07 21:56:31 compute-0 nova_compute[192716]: 2025-10-07 21:56:31.946 2 INFO nova.virt.libvirt.driver [None req-3cdfa1c6-3bae-49c0-bffb-c3ab222bd1ab b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Deleting instance files /var/lib/nova/instances/5138bd92-9a6e-4088-b0b2-bee3a14683ac_del
Oct 07 21:56:31 compute-0 nova_compute[192716]: 2025-10-07 21:56:31.947 2 INFO nova.virt.libvirt.driver [None req-3cdfa1c6-3bae-49c0-bffb-c3ab222bd1ab b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Deletion of /var/lib/nova/instances/5138bd92-9a6e-4088-b0b2-bee3a14683ac_del complete
Oct 07 21:56:31 compute-0 nova_compute[192716]: 2025-10-07 21:56:31.957 2 DEBUG nova.compute.manager [req-bab52690-3913-4afc-bfeb-6acf29f75da3 req-d01fb38c-727d-4a53-a20f-cba63d770a48 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Received event network-vif-unplugged-0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 21:56:31 compute-0 nova_compute[192716]: 2025-10-07 21:56:31.958 2 DEBUG oslo_concurrency.lockutils [req-bab52690-3913-4afc-bfeb-6acf29f75da3 req-d01fb38c-727d-4a53-a20f-cba63d770a48 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "5138bd92-9a6e-4088-b0b2-bee3a14683ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:56:31 compute-0 nova_compute[192716]: 2025-10-07 21:56:31.959 2 DEBUG oslo_concurrency.lockutils [req-bab52690-3913-4afc-bfeb-6acf29f75da3 req-d01fb38c-727d-4a53-a20f-cba63d770a48 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "5138bd92-9a6e-4088-b0b2-bee3a14683ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:56:31 compute-0 nova_compute[192716]: 2025-10-07 21:56:31.959 2 DEBUG oslo_concurrency.lockutils [req-bab52690-3913-4afc-bfeb-6acf29f75da3 req-d01fb38c-727d-4a53-a20f-cba63d770a48 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "5138bd92-9a6e-4088-b0b2-bee3a14683ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:56:31 compute-0 nova_compute[192716]: 2025-10-07 21:56:31.960 2 DEBUG nova.compute.manager [req-bab52690-3913-4afc-bfeb-6acf29f75da3 req-d01fb38c-727d-4a53-a20f-cba63d770a48 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] No waiting events found dispatching network-vif-unplugged-0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 21:56:31 compute-0 nova_compute[192716]: 2025-10-07 21:56:31.961 2 DEBUG nova.compute.manager [req-bab52690-3913-4afc-bfeb-6acf29f75da3 req-d01fb38c-727d-4a53-a20f-cba63d770a48 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Received event network-vif-unplugged-0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 07 21:56:32 compute-0 nova_compute[192716]: 2025-10-07 21:56:32.468 2 INFO nova.compute.manager [None req-3cdfa1c6-3bae-49c0-bffb-c3ab222bd1ab b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Took 1.47 seconds to destroy the instance on the hypervisor.
Oct 07 21:56:32 compute-0 nova_compute[192716]: 2025-10-07 21:56:32.469 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-3cdfa1c6-3bae-49c0-bffb-c3ab222bd1ab b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 07 21:56:32 compute-0 nova_compute[192716]: 2025-10-07 21:56:32.469 2 DEBUG nova.compute.manager [-] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 07 21:56:32 compute-0 nova_compute[192716]: 2025-10-07 21:56:32.470 2 DEBUG nova.network.neutron [-] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 07 21:56:32 compute-0 nova_compute[192716]: 2025-10-07 21:56:32.470 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:56:32 compute-0 nova_compute[192716]: 2025-10-07 21:56:32.564 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:56:32 compute-0 nova_compute[192716]: 2025-10-07 21:56:32.935 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:56:33 compute-0 nova_compute[192716]: 2025-10-07 21:56:33.321 2 DEBUG nova.network.neutron [-] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 21:56:33 compute-0 nova_compute[192716]: 2025-10-07 21:56:33.827 2 INFO nova.compute.manager [-] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Took 1.36 seconds to deallocate network for instance.
Oct 07 21:56:34 compute-0 nova_compute[192716]: 2025-10-07 21:56:34.009 2 DEBUG nova.compute.manager [req-d3fbfb0e-7465-4b6b-bfcd-42ba88b7424d req-5f9a3b7f-ab91-4f82-a855-59834442e7e5 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Received event network-vif-unplugged-0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 21:56:34 compute-0 nova_compute[192716]: 2025-10-07 21:56:34.009 2 DEBUG oslo_concurrency.lockutils [req-d3fbfb0e-7465-4b6b-bfcd-42ba88b7424d req-5f9a3b7f-ab91-4f82-a855-59834442e7e5 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "5138bd92-9a6e-4088-b0b2-bee3a14683ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:56:34 compute-0 nova_compute[192716]: 2025-10-07 21:56:34.009 2 DEBUG oslo_concurrency.lockutils [req-d3fbfb0e-7465-4b6b-bfcd-42ba88b7424d req-5f9a3b7f-ab91-4f82-a855-59834442e7e5 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "5138bd92-9a6e-4088-b0b2-bee3a14683ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:56:34 compute-0 nova_compute[192716]: 2025-10-07 21:56:34.010 2 DEBUG oslo_concurrency.lockutils [req-d3fbfb0e-7465-4b6b-bfcd-42ba88b7424d req-5f9a3b7f-ab91-4f82-a855-59834442e7e5 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "5138bd92-9a6e-4088-b0b2-bee3a14683ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:56:34 compute-0 nova_compute[192716]: 2025-10-07 21:56:34.010 2 DEBUG nova.compute.manager [req-d3fbfb0e-7465-4b6b-bfcd-42ba88b7424d req-5f9a3b7f-ab91-4f82-a855-59834442e7e5 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] No waiting events found dispatching network-vif-unplugged-0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 21:56:34 compute-0 nova_compute[192716]: 2025-10-07 21:56:34.010 2 WARNING nova.compute.manager [req-d3fbfb0e-7465-4b6b-bfcd-42ba88b7424d req-5f9a3b7f-ab91-4f82-a855-59834442e7e5 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Received unexpected event network-vif-unplugged-0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce for instance with vm_state deleted and task_state None.
Oct 07 21:56:34 compute-0 nova_compute[192716]: 2025-10-07 21:56:34.011 2 DEBUG nova.compute.manager [req-d3fbfb0e-7465-4b6b-bfcd-42ba88b7424d req-5f9a3b7f-ab91-4f82-a855-59834442e7e5 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Received event network-vif-plugged-0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 21:56:34 compute-0 nova_compute[192716]: 2025-10-07 21:56:34.011 2 DEBUG oslo_concurrency.lockutils [req-d3fbfb0e-7465-4b6b-bfcd-42ba88b7424d req-5f9a3b7f-ab91-4f82-a855-59834442e7e5 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "5138bd92-9a6e-4088-b0b2-bee3a14683ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:56:34 compute-0 nova_compute[192716]: 2025-10-07 21:56:34.011 2 DEBUG oslo_concurrency.lockutils [req-d3fbfb0e-7465-4b6b-bfcd-42ba88b7424d req-5f9a3b7f-ab91-4f82-a855-59834442e7e5 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "5138bd92-9a6e-4088-b0b2-bee3a14683ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:56:34 compute-0 nova_compute[192716]: 2025-10-07 21:56:34.011 2 DEBUG oslo_concurrency.lockutils [req-d3fbfb0e-7465-4b6b-bfcd-42ba88b7424d req-5f9a3b7f-ab91-4f82-a855-59834442e7e5 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "5138bd92-9a6e-4088-b0b2-bee3a14683ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:56:34 compute-0 nova_compute[192716]: 2025-10-07 21:56:34.011 2 DEBUG nova.compute.manager [req-d3fbfb0e-7465-4b6b-bfcd-42ba88b7424d req-5f9a3b7f-ab91-4f82-a855-59834442e7e5 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] No waiting events found dispatching network-vif-plugged-0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 21:56:34 compute-0 nova_compute[192716]: 2025-10-07 21:56:34.012 2 WARNING nova.compute.manager [req-d3fbfb0e-7465-4b6b-bfcd-42ba88b7424d req-5f9a3b7f-ab91-4f82-a855-59834442e7e5 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Received unexpected event network-vif-plugged-0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce for instance with vm_state deleted and task_state None.
Oct 07 21:56:34 compute-0 nova_compute[192716]: 2025-10-07 21:56:34.012 2 DEBUG nova.compute.manager [req-d3fbfb0e-7465-4b6b-bfcd-42ba88b7424d req-5f9a3b7f-ab91-4f82-a855-59834442e7e5 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Received event network-vif-plugged-0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 21:56:34 compute-0 nova_compute[192716]: 2025-10-07 21:56:34.012 2 DEBUG oslo_concurrency.lockutils [req-d3fbfb0e-7465-4b6b-bfcd-42ba88b7424d req-5f9a3b7f-ab91-4f82-a855-59834442e7e5 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "5138bd92-9a6e-4088-b0b2-bee3a14683ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:56:34 compute-0 nova_compute[192716]: 2025-10-07 21:56:34.012 2 DEBUG oslo_concurrency.lockutils [req-d3fbfb0e-7465-4b6b-bfcd-42ba88b7424d req-5f9a3b7f-ab91-4f82-a855-59834442e7e5 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "5138bd92-9a6e-4088-b0b2-bee3a14683ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:56:34 compute-0 nova_compute[192716]: 2025-10-07 21:56:34.012 2 DEBUG oslo_concurrency.lockutils [req-d3fbfb0e-7465-4b6b-bfcd-42ba88b7424d req-5f9a3b7f-ab91-4f82-a855-59834442e7e5 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "5138bd92-9a6e-4088-b0b2-bee3a14683ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:56:34 compute-0 nova_compute[192716]: 2025-10-07 21:56:34.013 2 DEBUG nova.compute.manager [req-d3fbfb0e-7465-4b6b-bfcd-42ba88b7424d req-5f9a3b7f-ab91-4f82-a855-59834442e7e5 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] No waiting events found dispatching network-vif-plugged-0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 21:56:34 compute-0 nova_compute[192716]: 2025-10-07 21:56:34.013 2 WARNING nova.compute.manager [req-d3fbfb0e-7465-4b6b-bfcd-42ba88b7424d req-5f9a3b7f-ab91-4f82-a855-59834442e7e5 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Received unexpected event network-vif-plugged-0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce for instance with vm_state deleted and task_state None.
Oct 07 21:56:34 compute-0 nova_compute[192716]: 2025-10-07 21:56:34.013 2 DEBUG nova.compute.manager [req-d3fbfb0e-7465-4b6b-bfcd-42ba88b7424d req-5f9a3b7f-ab91-4f82-a855-59834442e7e5 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Received event network-vif-unplugged-0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 21:56:34 compute-0 nova_compute[192716]: 2025-10-07 21:56:34.013 2 DEBUG oslo_concurrency.lockutils [req-d3fbfb0e-7465-4b6b-bfcd-42ba88b7424d req-5f9a3b7f-ab91-4f82-a855-59834442e7e5 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "5138bd92-9a6e-4088-b0b2-bee3a14683ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:56:34 compute-0 nova_compute[192716]: 2025-10-07 21:56:34.014 2 DEBUG oslo_concurrency.lockutils [req-d3fbfb0e-7465-4b6b-bfcd-42ba88b7424d req-5f9a3b7f-ab91-4f82-a855-59834442e7e5 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "5138bd92-9a6e-4088-b0b2-bee3a14683ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:56:34 compute-0 nova_compute[192716]: 2025-10-07 21:56:34.014 2 DEBUG oslo_concurrency.lockutils [req-d3fbfb0e-7465-4b6b-bfcd-42ba88b7424d req-5f9a3b7f-ab91-4f82-a855-59834442e7e5 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "5138bd92-9a6e-4088-b0b2-bee3a14683ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:56:34 compute-0 nova_compute[192716]: 2025-10-07 21:56:34.014 2 DEBUG nova.compute.manager [req-d3fbfb0e-7465-4b6b-bfcd-42ba88b7424d req-5f9a3b7f-ab91-4f82-a855-59834442e7e5 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] No waiting events found dispatching network-vif-unplugged-0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 21:56:34 compute-0 nova_compute[192716]: 2025-10-07 21:56:34.014 2 WARNING nova.compute.manager [req-d3fbfb0e-7465-4b6b-bfcd-42ba88b7424d req-5f9a3b7f-ab91-4f82-a855-59834442e7e5 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Received unexpected event network-vif-unplugged-0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce for instance with vm_state deleted and task_state None.
Oct 07 21:56:34 compute-0 nova_compute[192716]: 2025-10-07 21:56:34.015 2 DEBUG nova.compute.manager [req-d3fbfb0e-7465-4b6b-bfcd-42ba88b7424d req-5f9a3b7f-ab91-4f82-a855-59834442e7e5 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Received event network-vif-unplugged-0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 21:56:34 compute-0 nova_compute[192716]: 2025-10-07 21:56:34.015 2 DEBUG oslo_concurrency.lockutils [req-d3fbfb0e-7465-4b6b-bfcd-42ba88b7424d req-5f9a3b7f-ab91-4f82-a855-59834442e7e5 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "5138bd92-9a6e-4088-b0b2-bee3a14683ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:56:34 compute-0 nova_compute[192716]: 2025-10-07 21:56:34.015 2 DEBUG oslo_concurrency.lockutils [req-d3fbfb0e-7465-4b6b-bfcd-42ba88b7424d req-5f9a3b7f-ab91-4f82-a855-59834442e7e5 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "5138bd92-9a6e-4088-b0b2-bee3a14683ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:56:34 compute-0 nova_compute[192716]: 2025-10-07 21:56:34.015 2 DEBUG oslo_concurrency.lockutils [req-d3fbfb0e-7465-4b6b-bfcd-42ba88b7424d req-5f9a3b7f-ab91-4f82-a855-59834442e7e5 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "5138bd92-9a6e-4088-b0b2-bee3a14683ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:56:34 compute-0 nova_compute[192716]: 2025-10-07 21:56:34.015 2 DEBUG nova.compute.manager [req-d3fbfb0e-7465-4b6b-bfcd-42ba88b7424d req-5f9a3b7f-ab91-4f82-a855-59834442e7e5 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] No waiting events found dispatching network-vif-unplugged-0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 21:56:34 compute-0 nova_compute[192716]: 2025-10-07 21:56:34.016 2 WARNING nova.compute.manager [req-d3fbfb0e-7465-4b6b-bfcd-42ba88b7424d req-5f9a3b7f-ab91-4f82-a855-59834442e7e5 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Received unexpected event network-vif-unplugged-0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce for instance with vm_state deleted and task_state None.
Oct 07 21:56:34 compute-0 nova_compute[192716]: 2025-10-07 21:56:34.016 2 DEBUG nova.compute.manager [req-d3fbfb0e-7465-4b6b-bfcd-42ba88b7424d req-5f9a3b7f-ab91-4f82-a855-59834442e7e5 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 5138bd92-9a6e-4088-b0b2-bee3a14683ac] Received event network-vif-deleted-0f47f8fd-d86c-48a0-a8b1-c8e6ee8e65ce external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 21:56:34 compute-0 nova_compute[192716]: 2025-10-07 21:56:34.357 2 DEBUG oslo_concurrency.lockutils [None req-3cdfa1c6-3bae-49c0-bffb-c3ab222bd1ab b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:56:34 compute-0 nova_compute[192716]: 2025-10-07 21:56:34.358 2 DEBUG oslo_concurrency.lockutils [None req-3cdfa1c6-3bae-49c0-bffb-c3ab222bd1ab b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:56:34 compute-0 nova_compute[192716]: 2025-10-07 21:56:34.360 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:34 compute-0 nova_compute[192716]: 2025-10-07 21:56:34.419 2 DEBUG nova.compute.provider_tree [None req-3cdfa1c6-3bae-49c0-bffb-c3ab222bd1ab b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 21:56:34 compute-0 nova_compute[192716]: 2025-10-07 21:56:34.926 2 DEBUG nova.scheduler.client.report [None req-3cdfa1c6-3bae-49c0-bffb-c3ab222bd1ab b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 21:56:35 compute-0 nova_compute[192716]: 2025-10-07 21:56:35.441 2 DEBUG oslo_concurrency.lockutils [None req-3cdfa1c6-3bae-49c0-bffb-c3ab222bd1ab b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.083s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:56:35 compute-0 nova_compute[192716]: 2025-10-07 21:56:35.468 2 INFO nova.scheduler.client.report [None req-3cdfa1c6-3bae-49c0-bffb-c3ab222bd1ab b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Deleted allocations for instance 5138bd92-9a6e-4088-b0b2-bee3a14683ac
Oct 07 21:56:36 compute-0 nova_compute[192716]: 2025-10-07 21:56:36.492 2 DEBUG oslo_concurrency.lockutils [None req-3cdfa1c6-3bae-49c0-bffb-c3ab222bd1ab b71b837a81994b9694ede764e0406ac8 42e6cb8a77b54158b2345b916b6fd79b - - default default] Lock "5138bd92-9a6e-4088-b0b2-bee3a14683ac" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.025s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:56:36 compute-0 sshd-session[218189]: Invalid user psybnc from 116.110.151.5 port 54994
Oct 07 21:56:36 compute-0 nova_compute[192716]: 2025-10-07 21:56:36.944 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:37 compute-0 sshd-session[218189]: pam_unix(sshd:auth): check pass; user unknown
Oct 07 21:56:37 compute-0 sshd-session[218189]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=116.110.151.5
Oct 07 21:56:39 compute-0 sshd-session[218189]: Failed password for invalid user psybnc from 116.110.151.5 port 54994 ssh2
Oct 07 21:56:39 compute-0 nova_compute[192716]: 2025-10-07 21:56:39.362 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:40 compute-0 sshd-session[218189]: Connection closed by invalid user psybnc 116.110.151.5 port 54994 [preauth]
Oct 07 21:56:41 compute-0 nova_compute[192716]: 2025-10-07 21:56:41.946 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:42 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:42.819 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:e1:cc', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '62:84:ba:a4:1b:31'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 21:56:42 compute-0 nova_compute[192716]: 2025-10-07 21:56:42.820 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:42 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:42.821 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 07 21:56:44 compute-0 nova_compute[192716]: 2025-10-07 21:56:44.365 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:44 compute-0 podman[218192]: 2025-10-07 21:56:44.836286165 +0000 UTC m=+0.073338659 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_build_tag=watcher_latest, container_name=iscsid, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, config_id=iscsid, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4)
Oct 07 21:56:44 compute-0 podman[218193]: 2025-10-07 21:56:44.853212759 +0000 UTC m=+0.080914236 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20251007, config_id=multipathd, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 07 21:56:45 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:56:45.823 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=dca786dc-b408-4181-8e47-0e14c60f13da, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:56:46 compute-0 nova_compute[192716]: 2025-10-07 21:56:46.948 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:48 compute-0 podman[218232]: 2025-10-07 21:56:48.817960877 +0000 UTC m=+0.055908030 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 07 21:56:49 compute-0 nova_compute[192716]: 2025-10-07 21:56:49.366 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:51 compute-0 nova_compute[192716]: 2025-10-07 21:56:51.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:52 compute-0 nova_compute[192716]: 2025-10-07 21:56:52.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:54 compute-0 nova_compute[192716]: 2025-10-07 21:56:54.368 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:55 compute-0 podman[218256]: 2025-10-07 21:56:55.871578483 +0000 UTC m=+0.106451956 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251007, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Oct 07 21:56:57 compute-0 nova_compute[192716]: 2025-10-07 21:56:57.021 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:57 compute-0 podman[218282]: 2025-10-07 21:56:57.818576039 +0000 UTC m=+0.062615802 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251007, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Oct 07 21:56:59 compute-0 sshd-session[218302]: Invalid user matrix from 116.110.151.5 port 44662
Oct 07 21:56:59 compute-0 nova_compute[192716]: 2025-10-07 21:56:59.372 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:56:59 compute-0 podman[203153]: time="2025-10-07T21:56:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 21:56:59 compute-0 podman[203153]: @ - - [07/Oct/2025:21:56:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 21:56:59 compute-0 podman[203153]: @ - - [07/Oct/2025:21:56:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3008 "" "Go-http-client/1.1"
Oct 07 21:57:00 compute-0 sshd-session[218302]: pam_unix(sshd:auth): check pass; user unknown
Oct 07 21:57:00 compute-0 sshd-session[218302]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=116.110.151.5
Oct 07 21:57:01 compute-0 openstack_network_exporter[205305]: ERROR   21:57:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 21:57:01 compute-0 openstack_network_exporter[205305]: ERROR   21:57:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:57:01 compute-0 openstack_network_exporter[205305]: ERROR   21:57:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:57:01 compute-0 openstack_network_exporter[205305]: ERROR   21:57:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 21:57:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 21:57:01 compute-0 openstack_network_exporter[205305]: ERROR   21:57:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 21:57:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 21:57:01 compute-0 podman[218304]: 2025-10-07 21:57:01.858416556 +0000 UTC m=+0.090818909 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, version=9.6, config_id=edpm, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, architecture=x86_64, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., name=ubi9-minimal)
Oct 07 21:57:01 compute-0 sshd-session[218302]: Failed password for invalid user matrix from 116.110.151.5 port 44662 ssh2
Oct 07 21:57:02 compute-0 nova_compute[192716]: 2025-10-07 21:57:02.024 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:57:04 compute-0 sshd-session[218302]: Connection closed by invalid user matrix 116.110.151.5 port 44662 [preauth]
Oct 07 21:57:04 compute-0 nova_compute[192716]: 2025-10-07 21:57:04.375 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:57:05 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:57:05.046 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3d:0d:c9 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-b8350938-9140-4914-9fd9-576fae51c662', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b8350938-9140-4914-9fd9-576fae51c662', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a049bf0f330a49e7aa11cf49f3632f49', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b2ae172e-4471-427f-9271-ad3259dec3ad, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=a135f816-2409-493b-84b9-25f42a19538e) old=Port_Binding(mac=['fa:16:3e:3d:0d:c9'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-b8350938-9140-4914-9fd9-576fae51c662', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b8350938-9140-4914-9fd9-576fae51c662', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a049bf0f330a49e7aa11cf49f3632f49', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 21:57:05 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:57:05.047 103791 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port a135f816-2409-493b-84b9-25f42a19538e in datapath b8350938-9140-4914-9fd9-576fae51c662 updated
Oct 07 21:57:05 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:57:05.048 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b8350938-9140-4914-9fd9-576fae51c662, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 07 21:57:05 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:57:05.049 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[3f8e3106-affd-45c6-b3e3-8113215c894b]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:57:07 compute-0 nova_compute[192716]: 2025-10-07 21:57:07.027 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:57:09 compute-0 nova_compute[192716]: 2025-10-07 21:57:09.376 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:57:12 compute-0 nova_compute[192716]: 2025-10-07 21:57:12.029 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:57:13 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:57:13.933 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:31:b9:ad 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-85dd9226-1f75-45ab-b34a-7bd010edb097', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-85dd9226-1f75-45ab-b34a-7bd010edb097', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2aab90dc44084cef89c9f41e873a0e5b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f3680850-d3a8-49ba-8f06-daa11627bfb6, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=de058bed-3e48-427e-b1da-17fc0b77286b) old=Port_Binding(mac=['fa:16:3e:31:b9:ad'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-85dd9226-1f75-45ab-b34a-7bd010edb097', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-85dd9226-1f75-45ab-b34a-7bd010edb097', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2aab90dc44084cef89c9f41e873a0e5b', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 21:57:13 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:57:13.934 103791 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port de058bed-3e48-427e-b1da-17fc0b77286b in datapath 85dd9226-1f75-45ab-b34a-7bd010edb097 updated
Oct 07 21:57:13 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:57:13.935 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 85dd9226-1f75-45ab-b34a-7bd010edb097, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 07 21:57:13 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:57:13.935 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[ac3a5f84-00a4-4035-8f20-973ddc82cc8d]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:57:14 compute-0 nova_compute[192716]: 2025-10-07 21:57:14.378 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:57:15 compute-0 podman[218325]: 2025-10-07 21:57:15.824266434 +0000 UTC m=+0.055085597 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct 07 21:57:15 compute-0 podman[218324]: 2025-10-07 21:57:15.841240859 +0000 UTC m=+0.066007059 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251007, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 07 21:57:17 compute-0 nova_compute[192716]: 2025-10-07 21:57:17.066 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:57:19 compute-0 nova_compute[192716]: 2025-10-07 21:57:19.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:57:19 compute-0 podman[218366]: 2025-10-07 21:57:19.860371254 +0000 UTC m=+0.091186050 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 07 21:57:21 compute-0 nova_compute[192716]: 2025-10-07 21:57:21.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:57:22 compute-0 nova_compute[192716]: 2025-10-07 21:57:22.069 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:57:22 compute-0 nova_compute[192716]: 2025-10-07 21:57:22.989 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:57:22 compute-0 nova_compute[192716]: 2025-10-07 21:57:22.989 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 07 21:57:23 compute-0 nova_compute[192716]: 2025-10-07 21:57:23.985 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:57:23 compute-0 nova_compute[192716]: 2025-10-07 21:57:23.989 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:57:24 compute-0 nova_compute[192716]: 2025-10-07 21:57:24.382 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:57:25 compute-0 ovn_controller[94904]: 2025-10-07T21:57:25Z|00096|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Oct 07 21:57:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:57:25.617 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:57:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:57:25.617 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:57:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:57:25.617 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:57:26 compute-0 podman[218391]: 2025-10-07 21:57:26.85699098 +0000 UTC m=+0.100002222 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251007, tcib_build_tag=watcher_latest, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Oct 07 21:57:27 compute-0 nova_compute[192716]: 2025-10-07 21:57:27.071 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:57:27 compute-0 nova_compute[192716]: 2025-10-07 21:57:27.985 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:57:28 compute-0 nova_compute[192716]: 2025-10-07 21:57:28.494 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:57:28 compute-0 nova_compute[192716]: 2025-10-07 21:57:28.494 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:57:28 compute-0 podman[218418]: 2025-10-07 21:57:28.846245791 +0000 UTC m=+0.080472747 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251007, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Oct 07 21:57:28 compute-0 nova_compute[192716]: 2025-10-07 21:57:28.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:57:28 compute-0 nova_compute[192716]: 2025-10-07 21:57:28.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:57:29 compute-0 nova_compute[192716]: 2025-10-07 21:57:29.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:57:29 compute-0 nova_compute[192716]: 2025-10-07 21:57:29.500 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:57:29 compute-0 nova_compute[192716]: 2025-10-07 21:57:29.501 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:57:29 compute-0 nova_compute[192716]: 2025-10-07 21:57:29.501 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:57:29 compute-0 nova_compute[192716]: 2025-10-07 21:57:29.502 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 07 21:57:29 compute-0 nova_compute[192716]: 2025-10-07 21:57:29.740 2 WARNING nova.virt.libvirt.driver [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 21:57:29 compute-0 nova_compute[192716]: 2025-10-07 21:57:29.741 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:57:29 compute-0 podman[203153]: time="2025-10-07T21:57:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 21:57:29 compute-0 podman[203153]: @ - - [07/Oct/2025:21:57:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 21:57:29 compute-0 podman[203153]: @ - - [07/Oct/2025:21:57:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3012 "" "Go-http-client/1.1"
Oct 07 21:57:29 compute-0 nova_compute[192716]: 2025-10-07 21:57:29.771 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.030s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:57:29 compute-0 nova_compute[192716]: 2025-10-07 21:57:29.772 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5869MB free_disk=73.30620956420898GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 07 21:57:29 compute-0 nova_compute[192716]: 2025-10-07 21:57:29.773 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:57:29 compute-0 nova_compute[192716]: 2025-10-07 21:57:29.774 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:57:30 compute-0 nova_compute[192716]: 2025-10-07 21:57:30.838 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 07 21:57:30 compute-0 nova_compute[192716]: 2025-10-07 21:57:30.838 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:57:29 up  1:06,  0 user,  load average: 0.32, 0.36, 0.39\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 07 21:57:30 compute-0 nova_compute[192716]: 2025-10-07 21:57:30.959 2 DEBUG nova.compute.provider_tree [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 21:57:31 compute-0 openstack_network_exporter[205305]: ERROR   21:57:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 21:57:31 compute-0 openstack_network_exporter[205305]: ERROR   21:57:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:57:31 compute-0 openstack_network_exporter[205305]: ERROR   21:57:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:57:31 compute-0 openstack_network_exporter[205305]: ERROR   21:57:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 21:57:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 21:57:31 compute-0 openstack_network_exporter[205305]: ERROR   21:57:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 21:57:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 21:57:31 compute-0 nova_compute[192716]: 2025-10-07 21:57:31.468 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 21:57:31 compute-0 nova_compute[192716]: 2025-10-07 21:57:31.980 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 07 21:57:31 compute-0 nova_compute[192716]: 2025-10-07 21:57:31.981 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.207s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:57:32 compute-0 nova_compute[192716]: 2025-10-07 21:57:32.073 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:57:32 compute-0 podman[218440]: 2025-10-07 21:57:32.869291355 +0000 UTC m=+0.097096830 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, distribution-scope=public, release=1755695350, vcs-type=git, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, architecture=x86_64, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers)
Oct 07 21:57:34 compute-0 nova_compute[192716]: 2025-10-07 21:57:34.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:57:37 compute-0 nova_compute[192716]: 2025-10-07 21:57:37.075 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:57:39 compute-0 nova_compute[192716]: 2025-10-07 21:57:39.426 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:57:39 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Oct 07 21:57:42 compute-0 nova_compute[192716]: 2025-10-07 21:57:42.078 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:57:44 compute-0 nova_compute[192716]: 2025-10-07 21:57:44.430 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:57:44 compute-0 sshd-session[218462]: Invalid user john from 103.115.24.11 port 52994
Oct 07 21:57:44 compute-0 sshd-session[218462]: pam_unix(sshd:auth): check pass; user unknown
Oct 07 21:57:44 compute-0 sshd-session[218462]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.115.24.11
Oct 07 21:57:46 compute-0 sshd-session[218462]: Failed password for invalid user john from 103.115.24.11 port 52994 ssh2
Oct 07 21:57:46 compute-0 podman[218464]: 2025-10-07 21:57:46.830583141 +0000 UTC m=+0.062807102 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, container_name=iscsid, tcib_build_tag=watcher_latest, config_id=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 07 21:57:46 compute-0 podman[218465]: 2025-10-07 21:57:46.841374961 +0000 UTC m=+0.068867425 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=multipathd, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 07 21:57:47 compute-0 nova_compute[192716]: 2025-10-07 21:57:47.082 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:57:47 compute-0 sshd-session[218462]: Received disconnect from 103.115.24.11 port 52994:11: Bye Bye [preauth]
Oct 07 21:57:47 compute-0 sshd-session[218462]: Disconnected from invalid user john 103.115.24.11 port 52994 [preauth]
Oct 07 21:57:49 compute-0 nova_compute[192716]: 2025-10-07 21:57:49.431 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:57:50 compute-0 nova_compute[192716]: 2025-10-07 21:57:50.805 2 DEBUG oslo_concurrency.lockutils [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] Acquiring lock "9e0851d2-e80c-42d9-8197-540d52ac8500" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:57:50 compute-0 nova_compute[192716]: 2025-10-07 21:57:50.805 2 DEBUG oslo_concurrency.lockutils [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] Lock "9e0851d2-e80c-42d9-8197-540d52ac8500" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:57:50 compute-0 podman[218503]: 2025-10-07 21:57:50.838621412 +0000 UTC m=+0.073412237 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 07 21:57:51 compute-0 nova_compute[192716]: 2025-10-07 21:57:51.311 2 DEBUG nova.compute.manager [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Oct 07 21:57:51 compute-0 nova_compute[192716]: 2025-10-07 21:57:51.913 2 DEBUG oslo_concurrency.lockutils [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:57:51 compute-0 nova_compute[192716]: 2025-10-07 21:57:51.913 2 DEBUG oslo_concurrency.lockutils [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:57:51 compute-0 nova_compute[192716]: 2025-10-07 21:57:51.923 2 DEBUG nova.virt.hardware [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Oct 07 21:57:51 compute-0 nova_compute[192716]: 2025-10-07 21:57:51.924 2 INFO nova.compute.claims [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Claim successful on node compute-0.ctlplane.example.com
Oct 07 21:57:52 compute-0 nova_compute[192716]: 2025-10-07 21:57:52.085 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:57:53 compute-0 nova_compute[192716]: 2025-10-07 21:57:53.019 2 DEBUG nova.compute.provider_tree [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 21:57:53 compute-0 nova_compute[192716]: 2025-10-07 21:57:53.525 2 DEBUG nova.scheduler.client.report [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 21:57:54 compute-0 nova_compute[192716]: 2025-10-07 21:57:54.038 2 DEBUG oslo_concurrency.lockutils [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.124s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:57:54 compute-0 nova_compute[192716]: 2025-10-07 21:57:54.038 2 DEBUG nova.compute.manager [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Oct 07 21:57:54 compute-0 nova_compute[192716]: 2025-10-07 21:57:54.432 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:57:54 compute-0 nova_compute[192716]: 2025-10-07 21:57:54.563 2 DEBUG nova.compute.manager [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Oct 07 21:57:54 compute-0 nova_compute[192716]: 2025-10-07 21:57:54.564 2 DEBUG nova.network.neutron [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Oct 07 21:57:54 compute-0 nova_compute[192716]: 2025-10-07 21:57:54.564 2 WARNING neutronclient.v2_0.client [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:57:54 compute-0 nova_compute[192716]: 2025-10-07 21:57:54.565 2 WARNING neutronclient.v2_0.client [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:57:55 compute-0 nova_compute[192716]: 2025-10-07 21:57:55.084 2 INFO nova.virt.libvirt.driver [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 07 21:57:55 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:57:55.235 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:e1:cc', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '62:84:ba:a4:1b:31'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 21:57:55 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:57:55.235 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 07 21:57:55 compute-0 nova_compute[192716]: 2025-10-07 21:57:55.272 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:57:55 compute-0 nova_compute[192716]: 2025-10-07 21:57:55.497 2 DEBUG nova.network.neutron [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Successfully created port: 5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Oct 07 21:57:55 compute-0 nova_compute[192716]: 2025-10-07 21:57:55.593 2 DEBUG nova.compute.manager [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Oct 07 21:57:56 compute-0 nova_compute[192716]: 2025-10-07 21:57:56.168 2 DEBUG nova.network.neutron [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Successfully updated port: 5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Oct 07 21:57:56 compute-0 nova_compute[192716]: 2025-10-07 21:57:56.252 2 DEBUG nova.compute.manager [req-9f29623d-73c3-4202-967c-376206b7a866 req-ff6ea1b6-deda-46b2-b20b-b2c8b96312d1 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Received event network-changed-5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 21:57:56 compute-0 nova_compute[192716]: 2025-10-07 21:57:56.253 2 DEBUG nova.compute.manager [req-9f29623d-73c3-4202-967c-376206b7a866 req-ff6ea1b6-deda-46b2-b20b-b2c8b96312d1 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Refreshing instance network info cache due to event network-changed-5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Oct 07 21:57:56 compute-0 nova_compute[192716]: 2025-10-07 21:57:56.253 2 DEBUG oslo_concurrency.lockutils [req-9f29623d-73c3-4202-967c-376206b7a866 req-ff6ea1b6-deda-46b2-b20b-b2c8b96312d1 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "refresh_cache-9e0851d2-e80c-42d9-8197-540d52ac8500" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 07 21:57:56 compute-0 nova_compute[192716]: 2025-10-07 21:57:56.254 2 DEBUG oslo_concurrency.lockutils [req-9f29623d-73c3-4202-967c-376206b7a866 req-ff6ea1b6-deda-46b2-b20b-b2c8b96312d1 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquired lock "refresh_cache-9e0851d2-e80c-42d9-8197-540d52ac8500" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 07 21:57:56 compute-0 nova_compute[192716]: 2025-10-07 21:57:56.254 2 DEBUG nova.network.neutron [req-9f29623d-73c3-4202-967c-376206b7a866 req-ff6ea1b6-deda-46b2-b20b-b2c8b96312d1 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Refreshing network info cache for port 5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Oct 07 21:57:56 compute-0 nova_compute[192716]: 2025-10-07 21:57:56.617 2 DEBUG nova.compute.manager [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Oct 07 21:57:56 compute-0 nova_compute[192716]: 2025-10-07 21:57:56.619 2 DEBUG nova.virt.libvirt.driver [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Oct 07 21:57:56 compute-0 nova_compute[192716]: 2025-10-07 21:57:56.620 2 INFO nova.virt.libvirt.driver [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Creating image(s)
Oct 07 21:57:56 compute-0 nova_compute[192716]: 2025-10-07 21:57:56.621 2 DEBUG oslo_concurrency.lockutils [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] Acquiring lock "/var/lib/nova/instances/9e0851d2-e80c-42d9-8197-540d52ac8500/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:57:56 compute-0 nova_compute[192716]: 2025-10-07 21:57:56.621 2 DEBUG oslo_concurrency.lockutils [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] Lock "/var/lib/nova/instances/9e0851d2-e80c-42d9-8197-540d52ac8500/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:57:56 compute-0 nova_compute[192716]: 2025-10-07 21:57:56.622 2 DEBUG oslo_concurrency.lockutils [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] Lock "/var/lib/nova/instances/9e0851d2-e80c-42d9-8197-540d52ac8500/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:57:56 compute-0 nova_compute[192716]: 2025-10-07 21:57:56.623 2 DEBUG oslo_utils.imageutils.format_inspector [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 21:57:56 compute-0 nova_compute[192716]: 2025-10-07 21:57:56.630 2 DEBUG oslo_utils.imageutils.format_inspector [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 21:57:56 compute-0 nova_compute[192716]: 2025-10-07 21:57:56.632 2 DEBUG oslo_concurrency.processutils [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:57:56 compute-0 nova_compute[192716]: 2025-10-07 21:57:56.673 2 DEBUG oslo_concurrency.lockutils [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] Acquiring lock "refresh_cache-9e0851d2-e80c-42d9-8197-540d52ac8500" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 07 21:57:56 compute-0 nova_compute[192716]: 2025-10-07 21:57:56.684 2 DEBUG oslo_concurrency.processutils [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:57:56 compute-0 nova_compute[192716]: 2025-10-07 21:57:56.684 2 DEBUG oslo_concurrency.lockutils [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] Acquiring lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:57:56 compute-0 nova_compute[192716]: 2025-10-07 21:57:56.685 2 DEBUG oslo_concurrency.lockutils [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] Lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:57:56 compute-0 nova_compute[192716]: 2025-10-07 21:57:56.686 2 DEBUG oslo_utils.imageutils.format_inspector [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 21:57:56 compute-0 nova_compute[192716]: 2025-10-07 21:57:56.689 2 DEBUG oslo_utils.imageutils.format_inspector [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 21:57:56 compute-0 nova_compute[192716]: 2025-10-07 21:57:56.689 2 DEBUG oslo_concurrency.processutils [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:57:56 compute-0 nova_compute[192716]: 2025-10-07 21:57:56.761 2 WARNING neutronclient.v2_0.client [req-9f29623d-73c3-4202-967c-376206b7a866 req-ff6ea1b6-deda-46b2-b20b-b2c8b96312d1 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:57:56 compute-0 nova_compute[192716]: 2025-10-07 21:57:56.770 2 DEBUG oslo_concurrency.processutils [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:57:56 compute-0 nova_compute[192716]: 2025-10-07 21:57:56.771 2 DEBUG oslo_concurrency.processutils [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71,backing_fmt=raw /var/lib/nova/instances/9e0851d2-e80c-42d9-8197-540d52ac8500/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:57:56 compute-0 nova_compute[192716]: 2025-10-07 21:57:56.807 2 DEBUG oslo_concurrency.processutils [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71,backing_fmt=raw /var/lib/nova/instances/9e0851d2-e80c-42d9-8197-540d52ac8500/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:57:56 compute-0 nova_compute[192716]: 2025-10-07 21:57:56.809 2 DEBUG oslo_concurrency.lockutils [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] Lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.124s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:57:56 compute-0 nova_compute[192716]: 2025-10-07 21:57:56.810 2 DEBUG oslo_concurrency.processutils [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:57:56 compute-0 nova_compute[192716]: 2025-10-07 21:57:56.884 2 DEBUG oslo_concurrency.processutils [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:57:56 compute-0 nova_compute[192716]: 2025-10-07 21:57:56.885 2 DEBUG nova.virt.disk.api [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] Checking if we can resize image /var/lib/nova/instances/9e0851d2-e80c-42d9-8197-540d52ac8500/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 07 21:57:56 compute-0 nova_compute[192716]: 2025-10-07 21:57:56.886 2 DEBUG oslo_concurrency.processutils [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9e0851d2-e80c-42d9-8197-540d52ac8500/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:57:56 compute-0 nova_compute[192716]: 2025-10-07 21:57:56.945 2 DEBUG oslo_concurrency.processutils [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9e0851d2-e80c-42d9-8197-540d52ac8500/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:57:56 compute-0 nova_compute[192716]: 2025-10-07 21:57:56.946 2 DEBUG nova.virt.disk.api [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] Cannot resize image /var/lib/nova/instances/9e0851d2-e80c-42d9-8197-540d52ac8500/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 07 21:57:56 compute-0 nova_compute[192716]: 2025-10-07 21:57:56.947 2 DEBUG nova.virt.libvirt.driver [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Oct 07 21:57:56 compute-0 nova_compute[192716]: 2025-10-07 21:57:56.948 2 DEBUG nova.virt.libvirt.driver [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Ensure instance console log exists: /var/lib/nova/instances/9e0851d2-e80c-42d9-8197-540d52ac8500/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Oct 07 21:57:56 compute-0 nova_compute[192716]: 2025-10-07 21:57:56.948 2 DEBUG oslo_concurrency.lockutils [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:57:56 compute-0 nova_compute[192716]: 2025-10-07 21:57:56.949 2 DEBUG oslo_concurrency.lockutils [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:57:56 compute-0 nova_compute[192716]: 2025-10-07 21:57:56.949 2 DEBUG oslo_concurrency.lockutils [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:57:57 compute-0 nova_compute[192716]: 2025-10-07 21:57:57.088 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:57:57 compute-0 nova_compute[192716]: 2025-10-07 21:57:57.854 2 DEBUG nova.network.neutron [req-9f29623d-73c3-4202-967c-376206b7a866 req-ff6ea1b6-deda-46b2-b20b-b2c8b96312d1 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 07 21:57:57 compute-0 podman[218543]: 2025-10-07 21:57:57.873245831 +0000 UTC m=+0.108267181 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251007)
Oct 07 21:57:58 compute-0 nova_compute[192716]: 2025-10-07 21:57:58.844 2 DEBUG nova.network.neutron [req-9f29623d-73c3-4202-967c-376206b7a866 req-ff6ea1b6-deda-46b2-b20b-b2c8b96312d1 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 21:57:59 compute-0 nova_compute[192716]: 2025-10-07 21:57:59.357 2 DEBUG oslo_concurrency.lockutils [req-9f29623d-73c3-4202-967c-376206b7a866 req-ff6ea1b6-deda-46b2-b20b-b2c8b96312d1 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Releasing lock "refresh_cache-9e0851d2-e80c-42d9-8197-540d52ac8500" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 07 21:57:59 compute-0 nova_compute[192716]: 2025-10-07 21:57:59.358 2 DEBUG oslo_concurrency.lockutils [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] Acquired lock "refresh_cache-9e0851d2-e80c-42d9-8197-540d52ac8500" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 07 21:57:59 compute-0 nova_compute[192716]: 2025-10-07 21:57:59.359 2 DEBUG nova.network.neutron [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 07 21:57:59 compute-0 nova_compute[192716]: 2025-10-07 21:57:59.434 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:57:59 compute-0 podman[203153]: time="2025-10-07T21:57:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 21:57:59 compute-0 podman[203153]: @ - - [07/Oct/2025:21:57:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 21:57:59 compute-0 podman[203153]: @ - - [07/Oct/2025:21:57:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3010 "" "Go-http-client/1.1"
Oct 07 21:57:59 compute-0 podman[218570]: 2025-10-07 21:57:59.814088337 +0000 UTC m=+0.059979860 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, 
org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct 07 21:58:00 compute-0 nova_compute[192716]: 2025-10-07 21:58:00.861 2 DEBUG nova.network.neutron [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 07 21:58:01 compute-0 openstack_network_exporter[205305]: ERROR   21:58:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:58:01 compute-0 openstack_network_exporter[205305]: ERROR   21:58:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:58:01 compute-0 openstack_network_exporter[205305]: ERROR   21:58:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 21:58:01 compute-0 openstack_network_exporter[205305]: ERROR   21:58:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 21:58:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 21:58:01 compute-0 openstack_network_exporter[205305]: ERROR   21:58:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 21:58:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 21:58:02 compute-0 nova_compute[192716]: 2025-10-07 21:58:02.128 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:58:02 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:58:02.236 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=dca786dc-b408-4181-8e47-0e14c60f13da, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:58:02 compute-0 nova_compute[192716]: 2025-10-07 21:58:02.829 2 WARNING neutronclient.v2_0.client [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:58:03 compute-0 podman[218589]: 2025-10-07 21:58:03.830417948 +0000 UTC m=+0.071039469 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, build-date=2025-08-20T13:12:41, version=9.6, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, architecture=x86_64, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible)
Oct 07 21:58:03 compute-0 nova_compute[192716]: 2025-10-07 21:58:03.874 2 DEBUG nova.network.neutron [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Updating instance_info_cache with network_info: [{"id": "5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553", "address": "fa:16:3e:e4:5d:63", "network": {"id": "b8350938-9140-4914-9fd9-576fae51c662", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1009648925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a049bf0f330a49e7aa11cf49f3632f49", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e2e8adf-9b", "ovs_interfaceid": "5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 21:58:04 compute-0 nova_compute[192716]: 2025-10-07 21:58:04.384 2 DEBUG oslo_concurrency.lockutils [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] Releasing lock "refresh_cache-9e0851d2-e80c-42d9-8197-540d52ac8500" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 07 21:58:04 compute-0 nova_compute[192716]: 2025-10-07 21:58:04.385 2 DEBUG nova.compute.manager [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Instance network_info: |[{"id": "5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553", "address": "fa:16:3e:e4:5d:63", "network": {"id": "b8350938-9140-4914-9fd9-576fae51c662", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1009648925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a049bf0f330a49e7aa11cf49f3632f49", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e2e8adf-9b", "ovs_interfaceid": "5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Oct 07 21:58:04 compute-0 nova_compute[192716]: 2025-10-07 21:58:04.388 2 DEBUG nova.virt.libvirt.driver [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Start _get_guest_xml network_info=[{"id": "5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553", "address": "fa:16:3e:e4:5d:63", "network": {"id": "b8350938-9140-4914-9fd9-576fae51c662", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1009648925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a049bf0f330a49e7aa11cf49f3632f49", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e2e8adf-9b", "ovs_interfaceid": "5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T21:45:40Z,direct_url=<?>,disk_format='qcow2',id=c40cab67-7e52-4762-b275-de0efa24bdf4,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='293ff4341f3d48a4ae100bf4fc7b99bd',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T21:45:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'boot_index': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'image_id': 'c40cab67-7e52-4762-b275-de0efa24bdf4'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Oct 07 21:58:04 compute-0 nova_compute[192716]: 2025-10-07 21:58:04.394 2 WARNING nova.virt.libvirt.driver [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 21:58:04 compute-0 nova_compute[192716]: 2025-10-07 21:58:04.396 2 DEBUG nova.virt.driver [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='c40cab67-7e52-4762-b275-de0efa24bdf4', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteBasicStrategy-server-1485660204', uuid='9e0851d2-e80c-42d9-8197-540d52ac8500'), owner=OwnerMeta(userid='f98d2168fb30489d88896037aa86ab52', username='tempest-TestExecuteBasicStrategy-340226908-project-admin', projectid='2aab90dc44084cef89c9f41e873a0e5b', projectname='tempest-TestExecuteBasicStrategy-340226908'), image=ImageMeta(id='c40cab67-7e52-4762-b275-de0efa24bdf4', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='e6ccb6f6-2ec4-4305-9fdd-4d931e83ed21', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553", "address": "fa:16:3e:e4:5d:63", "network": {"id": "b8350938-9140-4914-9fd9-576fae51c662", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1009648925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a049bf0f330a49e7aa11cf49f3632f49", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e2e8adf-9b", "ovs_interfaceid": "5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553", "qbh_params": 
null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251007122402.7278e66.el10', creation_time=1759874284.3964057) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Oct 07 21:58:04 compute-0 nova_compute[192716]: 2025-10-07 21:58:04.402 2 DEBUG nova.virt.libvirt.host [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Oct 07 21:58:04 compute-0 nova_compute[192716]: 2025-10-07 21:58:04.402 2 DEBUG nova.virt.libvirt.host [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Oct 07 21:58:04 compute-0 nova_compute[192716]: 2025-10-07 21:58:04.406 2 DEBUG nova.virt.libvirt.host [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Oct 07 21:58:04 compute-0 nova_compute[192716]: 2025-10-07 21:58:04.407 2 DEBUG nova.virt.libvirt.host [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Oct 07 21:58:04 compute-0 nova_compute[192716]: 2025-10-07 21:58:04.408 2 DEBUG nova.virt.libvirt.driver [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Oct 07 21:58:04 compute-0 nova_compute[192716]: 2025-10-07 21:58:04.408 2 DEBUG nova.virt.hardware [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-07T21:45:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='e6ccb6f6-2ec4-4305-9fdd-4d931e83ed21',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T21:45:40Z,direct_url=<?>,disk_format='qcow2',id=c40cab67-7e52-4762-b275-de0efa24bdf4,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='293ff4341f3d48a4ae100bf4fc7b99bd',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T21:45:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Oct 07 21:58:04 compute-0 nova_compute[192716]: 2025-10-07 21:58:04.409 2 DEBUG nova.virt.hardware [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Oct 07 21:58:04 compute-0 nova_compute[192716]: 2025-10-07 21:58:04.409 2 DEBUG nova.virt.hardware [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Oct 07 21:58:04 compute-0 nova_compute[192716]: 2025-10-07 21:58:04.410 2 DEBUG nova.virt.hardware [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Oct 07 21:58:04 compute-0 nova_compute[192716]: 2025-10-07 21:58:04.410 2 DEBUG nova.virt.hardware [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Oct 07 21:58:04 compute-0 nova_compute[192716]: 2025-10-07 21:58:04.411 2 DEBUG nova.virt.hardware [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Oct 07 21:58:04 compute-0 nova_compute[192716]: 2025-10-07 21:58:04.411 2 DEBUG nova.virt.hardware [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Oct 07 21:58:04 compute-0 nova_compute[192716]: 2025-10-07 21:58:04.412 2 DEBUG nova.virt.hardware [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Oct 07 21:58:04 compute-0 nova_compute[192716]: 2025-10-07 21:58:04.412 2 DEBUG nova.virt.hardware [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Oct 07 21:58:04 compute-0 nova_compute[192716]: 2025-10-07 21:58:04.413 2 DEBUG nova.virt.hardware [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Oct 07 21:58:04 compute-0 nova_compute[192716]: 2025-10-07 21:58:04.413 2 DEBUG nova.virt.hardware [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Oct 07 21:58:04 compute-0 nova_compute[192716]: 2025-10-07 21:58:04.420 2 DEBUG nova.virt.libvirt.vif [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-07T21:57:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-1485660204',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-1485660204',id=11,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2aab90dc44084cef89c9f41e873a0e5b',ramdisk_id='',reservation_id='r-esrw4agk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteBasicStrategy-340226908',owner_user_name='tempest-TestExecuteBasicStrategy-340226908-pr
oject-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T21:57:55Z,user_data=None,user_id='f98d2168fb30489d88896037aa86ab52',uuid=9e0851d2-e80c-42d9-8197-540d52ac8500,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553", "address": "fa:16:3e:e4:5d:63", "network": {"id": "b8350938-9140-4914-9fd9-576fae51c662", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1009648925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a049bf0f330a49e7aa11cf49f3632f49", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e2e8adf-9b", "ovs_interfaceid": "5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Oct 07 21:58:04 compute-0 nova_compute[192716]: 2025-10-07 21:58:04.421 2 DEBUG nova.network.os_vif_util [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] Converting VIF {"id": "5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553", "address": "fa:16:3e:e4:5d:63", "network": {"id": "b8350938-9140-4914-9fd9-576fae51c662", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1009648925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a049bf0f330a49e7aa11cf49f3632f49", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e2e8adf-9b", "ovs_interfaceid": "5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 21:58:04 compute-0 nova_compute[192716]: 2025-10-07 21:58:04.422 2 DEBUG nova.network.os_vif_util [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e4:5d:63,bridge_name='br-int',has_traffic_filtering=True,id=5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553,network=Network(b8350938-9140-4914-9fd9-576fae51c662),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5e2e8adf-9b') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 21:58:04 compute-0 nova_compute[192716]: 2025-10-07 21:58:04.423 2 DEBUG nova.objects.instance [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] Lazy-loading 'pci_devices' on Instance uuid 9e0851d2-e80c-42d9-8197-540d52ac8500 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 21:58:04 compute-0 nova_compute[192716]: 2025-10-07 21:58:04.437 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:58:04 compute-0 nova_compute[192716]: 2025-10-07 21:58:04.941 2 DEBUG nova.virt.libvirt.driver [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] End _get_guest_xml xml=<domain type="kvm">
Oct 07 21:58:04 compute-0 nova_compute[192716]:   <uuid>9e0851d2-e80c-42d9-8197-540d52ac8500</uuid>
Oct 07 21:58:04 compute-0 nova_compute[192716]:   <name>instance-0000000b</name>
Oct 07 21:58:04 compute-0 nova_compute[192716]:   <memory>131072</memory>
Oct 07 21:58:04 compute-0 nova_compute[192716]:   <vcpu>1</vcpu>
Oct 07 21:58:04 compute-0 nova_compute[192716]:   <metadata>
Oct 07 21:58:04 compute-0 nova_compute[192716]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 07 21:58:04 compute-0 nova_compute[192716]:       <nova:package version="32.1.0-0.20251007122402.7278e66.el10"/>
Oct 07 21:58:04 compute-0 nova_compute[192716]:       <nova:name>tempest-TestExecuteBasicStrategy-server-1485660204</nova:name>
Oct 07 21:58:04 compute-0 nova_compute[192716]:       <nova:creationTime>2025-10-07 21:58:04</nova:creationTime>
Oct 07 21:58:04 compute-0 nova_compute[192716]:       <nova:flavor name="m1.nano" id="e6ccb6f6-2ec4-4305-9fdd-4d931e83ed21">
Oct 07 21:58:04 compute-0 nova_compute[192716]:         <nova:memory>128</nova:memory>
Oct 07 21:58:04 compute-0 nova_compute[192716]:         <nova:disk>1</nova:disk>
Oct 07 21:58:04 compute-0 nova_compute[192716]:         <nova:swap>0</nova:swap>
Oct 07 21:58:04 compute-0 nova_compute[192716]:         <nova:ephemeral>0</nova:ephemeral>
Oct 07 21:58:04 compute-0 nova_compute[192716]:         <nova:vcpus>1</nova:vcpus>
Oct 07 21:58:04 compute-0 nova_compute[192716]:         <nova:extraSpecs>
Oct 07 21:58:04 compute-0 nova_compute[192716]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 07 21:58:04 compute-0 nova_compute[192716]:         </nova:extraSpecs>
Oct 07 21:58:04 compute-0 nova_compute[192716]:       </nova:flavor>
Oct 07 21:58:04 compute-0 nova_compute[192716]:       <nova:image uuid="c40cab67-7e52-4762-b275-de0efa24bdf4">
Oct 07 21:58:04 compute-0 nova_compute[192716]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 07 21:58:04 compute-0 nova_compute[192716]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 07 21:58:04 compute-0 nova_compute[192716]:         <nova:minDisk>1</nova:minDisk>
Oct 07 21:58:04 compute-0 nova_compute[192716]:         <nova:minRam>0</nova:minRam>
Oct 07 21:58:04 compute-0 nova_compute[192716]:         <nova:properties>
Oct 07 21:58:04 compute-0 nova_compute[192716]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 07 21:58:04 compute-0 nova_compute[192716]:         </nova:properties>
Oct 07 21:58:04 compute-0 nova_compute[192716]:       </nova:image>
Oct 07 21:58:04 compute-0 nova_compute[192716]:       <nova:owner>
Oct 07 21:58:04 compute-0 nova_compute[192716]:         <nova:user uuid="f98d2168fb30489d88896037aa86ab52">tempest-TestExecuteBasicStrategy-340226908-project-admin</nova:user>
Oct 07 21:58:04 compute-0 nova_compute[192716]:         <nova:project uuid="2aab90dc44084cef89c9f41e873a0e5b">tempest-TestExecuteBasicStrategy-340226908</nova:project>
Oct 07 21:58:04 compute-0 nova_compute[192716]:       </nova:owner>
Oct 07 21:58:04 compute-0 nova_compute[192716]:       <nova:root type="image" uuid="c40cab67-7e52-4762-b275-de0efa24bdf4"/>
Oct 07 21:58:04 compute-0 nova_compute[192716]:       <nova:ports>
Oct 07 21:58:04 compute-0 nova_compute[192716]:         <nova:port uuid="5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553">
Oct 07 21:58:04 compute-0 nova_compute[192716]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Oct 07 21:58:04 compute-0 nova_compute[192716]:         </nova:port>
Oct 07 21:58:04 compute-0 nova_compute[192716]:       </nova:ports>
Oct 07 21:58:04 compute-0 nova_compute[192716]:     </nova:instance>
Oct 07 21:58:04 compute-0 nova_compute[192716]:   </metadata>
Oct 07 21:58:04 compute-0 nova_compute[192716]:   <sysinfo type="smbios">
Oct 07 21:58:04 compute-0 nova_compute[192716]:     <system>
Oct 07 21:58:04 compute-0 nova_compute[192716]:       <entry name="manufacturer">RDO</entry>
Oct 07 21:58:04 compute-0 nova_compute[192716]:       <entry name="product">OpenStack Compute</entry>
Oct 07 21:58:04 compute-0 nova_compute[192716]:       <entry name="version">32.1.0-0.20251007122402.7278e66.el10</entry>
Oct 07 21:58:04 compute-0 nova_compute[192716]:       <entry name="serial">9e0851d2-e80c-42d9-8197-540d52ac8500</entry>
Oct 07 21:58:04 compute-0 nova_compute[192716]:       <entry name="uuid">9e0851d2-e80c-42d9-8197-540d52ac8500</entry>
Oct 07 21:58:04 compute-0 nova_compute[192716]:       <entry name="family">Virtual Machine</entry>
Oct 07 21:58:04 compute-0 nova_compute[192716]:     </system>
Oct 07 21:58:04 compute-0 nova_compute[192716]:   </sysinfo>
Oct 07 21:58:04 compute-0 nova_compute[192716]:   <os>
Oct 07 21:58:04 compute-0 nova_compute[192716]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 07 21:58:04 compute-0 nova_compute[192716]:     <boot dev="hd"/>
Oct 07 21:58:04 compute-0 nova_compute[192716]:     <smbios mode="sysinfo"/>
Oct 07 21:58:04 compute-0 nova_compute[192716]:   </os>
Oct 07 21:58:04 compute-0 nova_compute[192716]:   <features>
Oct 07 21:58:04 compute-0 nova_compute[192716]:     <acpi/>
Oct 07 21:58:04 compute-0 nova_compute[192716]:     <apic/>
Oct 07 21:58:04 compute-0 nova_compute[192716]:     <vmcoreinfo/>
Oct 07 21:58:04 compute-0 nova_compute[192716]:   </features>
Oct 07 21:58:04 compute-0 nova_compute[192716]:   <clock offset="utc">
Oct 07 21:58:04 compute-0 nova_compute[192716]:     <timer name="pit" tickpolicy="delay"/>
Oct 07 21:58:04 compute-0 nova_compute[192716]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 07 21:58:04 compute-0 nova_compute[192716]:     <timer name="hpet" present="no"/>
Oct 07 21:58:04 compute-0 nova_compute[192716]:   </clock>
Oct 07 21:58:04 compute-0 nova_compute[192716]:   <cpu mode="host-model" match="exact">
Oct 07 21:58:04 compute-0 nova_compute[192716]:     <topology sockets="1" cores="1" threads="1"/>
Oct 07 21:58:04 compute-0 nova_compute[192716]:   </cpu>
Oct 07 21:58:04 compute-0 nova_compute[192716]:   <devices>
Oct 07 21:58:04 compute-0 nova_compute[192716]:     <disk type="file" device="disk">
Oct 07 21:58:04 compute-0 nova_compute[192716]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 07 21:58:04 compute-0 nova_compute[192716]:       <source file="/var/lib/nova/instances/9e0851d2-e80c-42d9-8197-540d52ac8500/disk"/>
Oct 07 21:58:04 compute-0 nova_compute[192716]:       <target dev="vda" bus="virtio"/>
Oct 07 21:58:04 compute-0 nova_compute[192716]:     </disk>
Oct 07 21:58:04 compute-0 nova_compute[192716]:     <disk type="file" device="cdrom">
Oct 07 21:58:04 compute-0 nova_compute[192716]:       <driver name="qemu" type="raw" cache="none"/>
Oct 07 21:58:04 compute-0 nova_compute[192716]:       <source file="/var/lib/nova/instances/9e0851d2-e80c-42d9-8197-540d52ac8500/disk.config"/>
Oct 07 21:58:04 compute-0 nova_compute[192716]:       <target dev="sda" bus="sata"/>
Oct 07 21:58:04 compute-0 nova_compute[192716]:     </disk>
Oct 07 21:58:04 compute-0 nova_compute[192716]:     <interface type="ethernet">
Oct 07 21:58:04 compute-0 nova_compute[192716]:       <mac address="fa:16:3e:e4:5d:63"/>
Oct 07 21:58:04 compute-0 nova_compute[192716]:       <model type="virtio"/>
Oct 07 21:58:04 compute-0 nova_compute[192716]:       <driver name="vhost" rx_queue_size="512"/>
Oct 07 21:58:04 compute-0 nova_compute[192716]:       <mtu size="1442"/>
Oct 07 21:58:04 compute-0 nova_compute[192716]:       <target dev="tap5e2e8adf-9b"/>
Oct 07 21:58:04 compute-0 nova_compute[192716]:     </interface>
Oct 07 21:58:04 compute-0 nova_compute[192716]:     <serial type="pty">
Oct 07 21:58:04 compute-0 nova_compute[192716]:       <log file="/var/lib/nova/instances/9e0851d2-e80c-42d9-8197-540d52ac8500/console.log" append="off"/>
Oct 07 21:58:04 compute-0 nova_compute[192716]:     </serial>
Oct 07 21:58:04 compute-0 nova_compute[192716]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 07 21:58:04 compute-0 nova_compute[192716]:     <video>
Oct 07 21:58:04 compute-0 nova_compute[192716]:       <model type="virtio"/>
Oct 07 21:58:04 compute-0 nova_compute[192716]:     </video>
Oct 07 21:58:04 compute-0 nova_compute[192716]:     <input type="tablet" bus="usb"/>
Oct 07 21:58:04 compute-0 nova_compute[192716]:     <rng model="virtio">
Oct 07 21:58:04 compute-0 nova_compute[192716]:       <backend model="random">/dev/urandom</backend>
Oct 07 21:58:04 compute-0 nova_compute[192716]:     </rng>
Oct 07 21:58:04 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root"/>
Oct 07 21:58:04 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:58:04 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:58:04 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:58:04 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:58:04 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:58:04 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:58:04 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:58:04 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:58:04 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:58:04 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:58:04 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:58:04 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:58:04 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:58:04 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:58:04 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:58:04 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:58:04 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:58:04 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:58:04 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:58:04 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:58:04 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:58:04 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:58:04 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:58:04 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 21:58:04 compute-0 nova_compute[192716]:     <controller type="usb" index="0"/>
Oct 07 21:58:04 compute-0 nova_compute[192716]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 07 21:58:04 compute-0 nova_compute[192716]:       <stats period="10"/>
Oct 07 21:58:04 compute-0 nova_compute[192716]:     </memballoon>
Oct 07 21:58:04 compute-0 nova_compute[192716]:   </devices>
Oct 07 21:58:04 compute-0 nova_compute[192716]: </domain>
Oct 07 21:58:04 compute-0 nova_compute[192716]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Oct 07 21:58:04 compute-0 nova_compute[192716]: 2025-10-07 21:58:04.943 2 DEBUG nova.compute.manager [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Preparing to wait for external event network-vif-plugged-5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Oct 07 21:58:04 compute-0 nova_compute[192716]: 2025-10-07 21:58:04.943 2 DEBUG oslo_concurrency.lockutils [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] Acquiring lock "9e0851d2-e80c-42d9-8197-540d52ac8500-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:58:04 compute-0 nova_compute[192716]: 2025-10-07 21:58:04.944 2 DEBUG oslo_concurrency.lockutils [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] Lock "9e0851d2-e80c-42d9-8197-540d52ac8500-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:58:04 compute-0 nova_compute[192716]: 2025-10-07 21:58:04.944 2 DEBUG oslo_concurrency.lockutils [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] Lock "9e0851d2-e80c-42d9-8197-540d52ac8500-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:58:04 compute-0 nova_compute[192716]: 2025-10-07 21:58:04.945 2 DEBUG nova.virt.libvirt.vif [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-07T21:57:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-1485660204',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-1485660204',id=11,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2aab90dc44084cef89c9f41e873a0e5b',ramdisk_id='',reservation_id='r-esrw4agk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteBasicStrategy-340226908',owner_user_name='tempest-TestExecuteBasicStrategy-34
0226908-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T21:57:55Z,user_data=None,user_id='f98d2168fb30489d88896037aa86ab52',uuid=9e0851d2-e80c-42d9-8197-540d52ac8500,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553", "address": "fa:16:3e:e4:5d:63", "network": {"id": "b8350938-9140-4914-9fd9-576fae51c662", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1009648925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a049bf0f330a49e7aa11cf49f3632f49", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e2e8adf-9b", "ovs_interfaceid": "5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 07 21:58:04 compute-0 nova_compute[192716]: 2025-10-07 21:58:04.945 2 DEBUG nova.network.os_vif_util [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] Converting VIF {"id": "5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553", "address": "fa:16:3e:e4:5d:63", "network": {"id": "b8350938-9140-4914-9fd9-576fae51c662", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1009648925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a049bf0f330a49e7aa11cf49f3632f49", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e2e8adf-9b", "ovs_interfaceid": "5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 21:58:04 compute-0 nova_compute[192716]: 2025-10-07 21:58:04.946 2 DEBUG nova.network.os_vif_util [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e4:5d:63,bridge_name='br-int',has_traffic_filtering=True,id=5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553,network=Network(b8350938-9140-4914-9fd9-576fae51c662),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5e2e8adf-9b') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 21:58:04 compute-0 nova_compute[192716]: 2025-10-07 21:58:04.946 2 DEBUG os_vif [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e4:5d:63,bridge_name='br-int',has_traffic_filtering=True,id=5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553,network=Network(b8350938-9140-4914-9fd9-576fae51c662),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5e2e8adf-9b') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 07 21:58:04 compute-0 nova_compute[192716]: 2025-10-07 21:58:04.947 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:58:04 compute-0 nova_compute[192716]: 2025-10-07 21:58:04.947 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:58:04 compute-0 nova_compute[192716]: 2025-10-07 21:58:04.948 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 21:58:04 compute-0 nova_compute[192716]: 2025-10-07 21:58:04.948 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:58:04 compute-0 nova_compute[192716]: 2025-10-07 21:58:04.948 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '22c843b0-d16c-58fd-9d99-039ae61e6e96', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:58:04 compute-0 nova_compute[192716]: 2025-10-07 21:58:04.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:58:04 compute-0 nova_compute[192716]: 2025-10-07 21:58:04.951 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:58:04 compute-0 nova_compute[192716]: 2025-10-07 21:58:04.953 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:58:04 compute-0 nova_compute[192716]: 2025-10-07 21:58:04.953 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5e2e8adf-9b, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:58:04 compute-0 nova_compute[192716]: 2025-10-07 21:58:04.954 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap5e2e8adf-9b, col_values=(('qos', UUID('01e1e1cb-88cd-498f-b7ea-f1abf67e464b')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:58:04 compute-0 nova_compute[192716]: 2025-10-07 21:58:04.954 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap5e2e8adf-9b, col_values=(('external_ids', {'iface-id': '5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e4:5d:63', 'vm-uuid': '9e0851d2-e80c-42d9-8197-540d52ac8500'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:58:04 compute-0 nova_compute[192716]: 2025-10-07 21:58:04.956 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:58:04 compute-0 NetworkManager[51722]: <info>  [1759874284.9567] manager: (tap5e2e8adf-9b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/39)
Oct 07 21:58:04 compute-0 nova_compute[192716]: 2025-10-07 21:58:04.958 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 07 21:58:04 compute-0 nova_compute[192716]: 2025-10-07 21:58:04.962 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:58:04 compute-0 nova_compute[192716]: 2025-10-07 21:58:04.963 2 INFO os_vif [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e4:5d:63,bridge_name='br-int',has_traffic_filtering=True,id=5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553,network=Network(b8350938-9140-4914-9fd9-576fae51c662),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5e2e8adf-9b')
Oct 07 21:58:06 compute-0 nova_compute[192716]: 2025-10-07 21:58:06.506 2 DEBUG nova.virt.libvirt.driver [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 07 21:58:06 compute-0 nova_compute[192716]: 2025-10-07 21:58:06.507 2 DEBUG nova.virt.libvirt.driver [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 07 21:58:06 compute-0 nova_compute[192716]: 2025-10-07 21:58:06.507 2 DEBUG nova.virt.libvirt.driver [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] No VIF found with MAC fa:16:3e:e4:5d:63, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Oct 07 21:58:06 compute-0 nova_compute[192716]: 2025-10-07 21:58:06.508 2 INFO nova.virt.libvirt.driver [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Using config drive
Oct 07 21:58:07 compute-0 nova_compute[192716]: 2025-10-07 21:58:07.032 2 WARNING neutronclient.v2_0.client [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:58:07 compute-0 nova_compute[192716]: 2025-10-07 21:58:07.266 2 INFO nova.virt.libvirt.driver [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Creating config drive at /var/lib/nova/instances/9e0851d2-e80c-42d9-8197-540d52ac8500/disk.config
Oct 07 21:58:07 compute-0 nova_compute[192716]: 2025-10-07 21:58:07.278 2 DEBUG oslo_concurrency.processutils [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9e0851d2-e80c-42d9-8197-540d52ac8500/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251007122402.7278e66.el10 -quiet -J -r -V config-2 /tmp/tmpky6jvqbj execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:58:07 compute-0 nova_compute[192716]: 2025-10-07 21:58:07.412 2 DEBUG oslo_concurrency.processutils [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9e0851d2-e80c-42d9-8197-540d52ac8500/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251007122402.7278e66.el10 -quiet -J -r -V config-2 /tmp/tmpky6jvqbj" returned: 0 in 0.135s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:58:07 compute-0 kernel: tap5e2e8adf-9b: entered promiscuous mode
Oct 07 21:58:07 compute-0 NetworkManager[51722]: <info>  [1759874287.5111] manager: (tap5e2e8adf-9b): new Tun device (/org/freedesktop/NetworkManager/Devices/40)
Oct 07 21:58:07 compute-0 nova_compute[192716]: 2025-10-07 21:58:07.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:58:07 compute-0 ovn_controller[94904]: 2025-10-07T21:58:07Z|00097|binding|INFO|Claiming lport 5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553 for this chassis.
Oct 07 21:58:07 compute-0 ovn_controller[94904]: 2025-10-07T21:58:07Z|00098|binding|INFO|5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553: Claiming fa:16:3e:e4:5d:63 10.100.0.3
Oct 07 21:58:07 compute-0 nova_compute[192716]: 2025-10-07 21:58:07.519 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:58:07.527 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e4:5d:63 10.100.0.3'], port_security=['fa:16:3e:e4:5d:63 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '9e0851d2-e80c-42d9-8197-540d52ac8500', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b8350938-9140-4914-9fd9-576fae51c662', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2aab90dc44084cef89c9f41e873a0e5b', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a7ceae96-e65d-4bad-a42e-cea1bb910773', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b2ae172e-4471-427f-9271-ad3259dec3ad, chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], logical_port=5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:58:07.529 103791 INFO neutron.agent.ovn.metadata.agent [-] Port 5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553 in datapath b8350938-9140-4914-9fd9-576fae51c662 bound to our chassis
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:58:07.530 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b8350938-9140-4914-9fd9-576fae51c662
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:58:07.544 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[250928bc-6d96-44b2-8646-68b297ca114c]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:58:07.549 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb8350938-91 in ovnmeta-b8350938-9140-4914-9fd9-576fae51c662 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Oct 07 21:58:07 compute-0 systemd-udevd[218629]: Network interface NamePolicy= disabled on kernel command line.
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:58:07.552 214116 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb8350938-90 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:58:07.552 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[929f5f85-3751-4055-9aa2-84e3622e621b]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:58:07.553 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[be66a5d9-5d5b-4991-bd89-35d456dbf63c]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:58:07.569 103905 DEBUG oslo.privsep.daemon [-] privsep: reply[c48d429d-73e9-4e3f-8004-071b884bc6bf]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:58:07 compute-0 NetworkManager[51722]: <info>  [1759874287.5778] device (tap5e2e8adf-9b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 07 21:58:07 compute-0 NetworkManager[51722]: <info>  [1759874287.5792] device (tap5e2e8adf-9b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:58:07.595 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[a0f10c8d-6fba-491a-89ca-687dc3f3e201]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:58:07 compute-0 ovn_controller[94904]: 2025-10-07T21:58:07Z|00099|binding|INFO|Setting lport 5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553 ovn-installed in OVS
Oct 07 21:58:07 compute-0 ovn_controller[94904]: 2025-10-07T21:58:07Z|00100|binding|INFO|Setting lport 5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553 up in Southbound
Oct 07 21:58:07 compute-0 nova_compute[192716]: 2025-10-07 21:58:07.598 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:58:07 compute-0 systemd-machined[152719]: New machine qemu-7-instance-0000000b.
Oct 07 21:58:07 compute-0 nova_compute[192716]: 2025-10-07 21:58:07.604 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:58:07 compute-0 systemd[1]: Started Virtual Machine qemu-7-instance-0000000b.
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:58:07.638 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[01612cea-9b67-4fe6-b0ee-d26fb25747d7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:58:07 compute-0 NetworkManager[51722]: <info>  [1759874287.6464] manager: (tapb8350938-90): new Veth device (/org/freedesktop/NetworkManager/Devices/41)
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:58:07.644 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[8f5ad4b4-8033-498f-a93d-7968c58aa8e7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:58:07 compute-0 systemd-udevd[218635]: Network interface NamePolicy= disabled on kernel command line.
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:58:07.692 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[64ff1f52-7e1d-454f-b86e-94553c51b648]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:58:07.695 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[22835146-c011-42f4-bf55-b259b51a1ed5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:58:07 compute-0 NetworkManager[51722]: <info>  [1759874287.7295] device (tapb8350938-90): carrier: link connected
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:58:07.738 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[0da9a27f-f59c-4ff4-8fa2-21ecd44479b1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:58:07.761 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[895b35b7-4547-40c7-8d60-f1ddd3897bbb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb8350938-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3d:0d:c9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 402131, 'reachable_time': 36495, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218664, 'error': None, 'target': 'ovnmeta-b8350938-9140-4914-9fd9-576fae51c662', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:58:07.783 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[2a510ba2-b8d1-4d0c-b504-6dfa66bd4b77]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe3d:dc9'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 402131, 'tstamp': 402131}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218665, 'error': None, 'target': 'ovnmeta-b8350938-9140-4914-9fd9-576fae51c662', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:58:07.810 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[7bcbcb2e-50ae-4344-8cf9-601053b146f4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb8350938-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3d:0d:c9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 402131, 'reachable_time': 36495, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218666, 'error': None, 'target': 'ovnmeta-b8350938-9140-4914-9fd9-576fae51c662', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:58:07.853 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[b35da04a-0774-4fc4-8699-45e8e63ef04c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:58:07.951 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[12567720-4074-4fa5-b4b2-feba6875f0cb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:58:07.954 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb8350938-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:58:07.954 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:58:07.955 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb8350938-90, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:58:07 compute-0 nova_compute[192716]: 2025-10-07 21:58:07.957 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:58:07 compute-0 NetworkManager[51722]: <info>  [1759874287.9581] manager: (tapb8350938-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/42)
Oct 07 21:58:07 compute-0 kernel: tapb8350938-90: entered promiscuous mode
Oct 07 21:58:07 compute-0 nova_compute[192716]: 2025-10-07 21:58:07.962 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:58:07.962 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb8350938-90, col_values=(('external_ids', {'iface-id': 'a135f816-2409-493b-84b9-25f42a19538e'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:58:07 compute-0 nova_compute[192716]: 2025-10-07 21:58:07.964 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:58:07 compute-0 ovn_controller[94904]: 2025-10-07T21:58:07Z|00101|binding|INFO|Releasing lport a135f816-2409-493b-84b9-25f42a19538e from this chassis (sb_readonly=0)
Oct 07 21:58:07 compute-0 nova_compute[192716]: 2025-10-07 21:58:07.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:58:07.990 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[8a780c2b-b3c9-4c78-ba27-7c78809cb5db]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:58:07.991 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b8350938-9140-4914-9fd9-576fae51c662.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b8350938-9140-4914-9fd9-576fae51c662.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:58:07.991 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b8350938-9140-4914-9fd9-576fae51c662.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b8350938-9140-4914-9fd9-576fae51c662.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:58:07.992 103791 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for b8350938-9140-4914-9fd9-576fae51c662 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:58:07.992 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b8350938-9140-4914-9fd9-576fae51c662.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b8350938-9140-4914-9fd9-576fae51c662.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:58:07.992 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[9b85d5d0-f684-425b-82d1-592c2e9f5520]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:58:07.993 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b8350938-9140-4914-9fd9-576fae51c662.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b8350938-9140-4914-9fd9-576fae51c662.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:58:07.994 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[5e8aa437-212b-4fad-8f61-0a411b2705e5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:58:07.994 103791 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]: global
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]:     log         /dev/log local0 debug
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]:     log-tag     haproxy-metadata-proxy-b8350938-9140-4914-9fd9-576fae51c662
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]:     user        root
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]:     group       root
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]:     maxconn     1024
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]:     pidfile     /var/lib/neutron/external/pids/b8350938-9140-4914-9fd9-576fae51c662.pid.haproxy
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]:     daemon
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]: 
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]: defaults
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]:     log global
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]:     mode http
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]:     option httplog
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]:     option dontlognull
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]:     option http-server-close
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]:     option forwardfor
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]:     retries                 3
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]:     timeout http-request    30s
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]:     timeout connect         30s
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]:     timeout client          32s
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]:     timeout server          32s
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]:     timeout http-keep-alive 30s
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]: 
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]: listen listener
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]:     bind 169.254.169.254:80
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]:     
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]:     server metadata /var/lib/neutron/metadata_proxy
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]: 
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]:     http-request add-header X-OVN-Network-ID b8350938-9140-4914-9fd9-576fae51c662
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Oct 07 21:58:07 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:58:07.995 103791 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b8350938-9140-4914-9fd9-576fae51c662', 'env', 'PROCESS_TAG=haproxy-b8350938-9140-4914-9fd9-576fae51c662', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b8350938-9140-4914-9fd9-576fae51c662.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Oct 07 21:58:08 compute-0 nova_compute[192716]: 2025-10-07 21:58:08.003 2 DEBUG nova.compute.manager [req-b5c426dd-53b1-483b-b102-656c82f3cad5 req-c8f232e5-3a10-4505-a113-afb07499cc56 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Received event network-vif-plugged-5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 21:58:08 compute-0 nova_compute[192716]: 2025-10-07 21:58:08.004 2 DEBUG oslo_concurrency.lockutils [req-b5c426dd-53b1-483b-b102-656c82f3cad5 req-c8f232e5-3a10-4505-a113-afb07499cc56 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "9e0851d2-e80c-42d9-8197-540d52ac8500-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:58:08 compute-0 nova_compute[192716]: 2025-10-07 21:58:08.004 2 DEBUG oslo_concurrency.lockutils [req-b5c426dd-53b1-483b-b102-656c82f3cad5 req-c8f232e5-3a10-4505-a113-afb07499cc56 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "9e0851d2-e80c-42d9-8197-540d52ac8500-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:58:08 compute-0 nova_compute[192716]: 2025-10-07 21:58:08.004 2 DEBUG oslo_concurrency.lockutils [req-b5c426dd-53b1-483b-b102-656c82f3cad5 req-c8f232e5-3a10-4505-a113-afb07499cc56 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "9e0851d2-e80c-42d9-8197-540d52ac8500-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:58:08 compute-0 nova_compute[192716]: 2025-10-07 21:58:08.005 2 DEBUG nova.compute.manager [req-b5c426dd-53b1-483b-b102-656c82f3cad5 req-c8f232e5-3a10-4505-a113-afb07499cc56 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Processing event network-vif-plugged-5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Oct 07 21:58:08 compute-0 podman[218705]: 2025-10-07 21:58:08.466125807 +0000 UTC m=+0.075515737 container create 2d2fc71f77ef4e1d973a3a82c47beb315311e5279f42b702dda2274a22bdcb6c (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-b8350938-9140-4914-9fd9-576fae51c662, org.label-schema.build-date=20251007, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4)
Oct 07 21:58:08 compute-0 podman[218705]: 2025-10-07 21:58:08.422304374 +0000 UTC m=+0.031694364 image pull 24d4277b41bbd1d97b6f360ea068040fe96182680512bacad34d1f578f4798a9 38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Oct 07 21:58:08 compute-0 nova_compute[192716]: 2025-10-07 21:58:08.512 2 DEBUG nova.compute.manager [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Oct 07 21:58:08 compute-0 nova_compute[192716]: 2025-10-07 21:58:08.516 2 DEBUG nova.virt.libvirt.driver [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Oct 07 21:58:08 compute-0 nova_compute[192716]: 2025-10-07 21:58:08.521 2 INFO nova.virt.libvirt.driver [-] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Instance spawned successfully.
Oct 07 21:58:08 compute-0 nova_compute[192716]: 2025-10-07 21:58:08.522 2 DEBUG nova.virt.libvirt.driver [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Oct 07 21:58:08 compute-0 systemd[1]: Started libpod-conmon-2d2fc71f77ef4e1d973a3a82c47beb315311e5279f42b702dda2274a22bdcb6c.scope.
Oct 07 21:58:08 compute-0 systemd[1]: Started libcrun container.
Oct 07 21:58:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7eafe9f6c7aa5707faf15fbeb8d4f14187a5fc71710f9dfe67655ed482c1f8b8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 07 21:58:08 compute-0 podman[218705]: 2025-10-07 21:58:08.595526007 +0000 UTC m=+0.204915947 container init 2d2fc71f77ef4e1d973a3a82c47beb315311e5279f42b702dda2274a22bdcb6c (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-b8350938-9140-4914-9fd9-576fae51c662, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 07 21:58:08 compute-0 podman[218705]: 2025-10-07 21:58:08.602195739 +0000 UTC m=+0.211585639 container start 2d2fc71f77ef4e1d973a3a82c47beb315311e5279f42b702dda2274a22bdcb6c (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-b8350938-9140-4914-9fd9-576fae51c662, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.build-date=20251007)
Oct 07 21:58:08 compute-0 neutron-haproxy-ovnmeta-b8350938-9140-4914-9fd9-576fae51c662[218721]: [NOTICE]   (218725) : New worker (218727) forked
Oct 07 21:58:08 compute-0 neutron-haproxy-ovnmeta-b8350938-9140-4914-9fd9-576fae51c662[218721]: [NOTICE]   (218725) : Loading success.
Oct 07 21:58:09 compute-0 nova_compute[192716]: 2025-10-07 21:58:09.047 2 DEBUG nova.virt.libvirt.driver [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 21:58:09 compute-0 nova_compute[192716]: 2025-10-07 21:58:09.048 2 DEBUG nova.virt.libvirt.driver [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 21:58:09 compute-0 nova_compute[192716]: 2025-10-07 21:58:09.048 2 DEBUG nova.virt.libvirt.driver [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 21:58:09 compute-0 nova_compute[192716]: 2025-10-07 21:58:09.049 2 DEBUG nova.virt.libvirt.driver [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 21:58:09 compute-0 nova_compute[192716]: 2025-10-07 21:58:09.049 2 DEBUG nova.virt.libvirt.driver [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 21:58:09 compute-0 nova_compute[192716]: 2025-10-07 21:58:09.049 2 DEBUG nova.virt.libvirt.driver [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 21:58:09 compute-0 nova_compute[192716]: 2025-10-07 21:58:09.439 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:58:09 compute-0 nova_compute[192716]: 2025-10-07 21:58:09.559 2 INFO nova.compute.manager [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Took 12.94 seconds to spawn the instance on the hypervisor.
Oct 07 21:58:09 compute-0 nova_compute[192716]: 2025-10-07 21:58:09.560 2 DEBUG nova.compute.manager [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 07 21:58:09 compute-0 nova_compute[192716]: 2025-10-07 21:58:09.956 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:58:10 compute-0 nova_compute[192716]: 2025-10-07 21:58:10.060 2 DEBUG nova.compute.manager [req-76381bdf-585b-478b-aaba-e2eb3e29dd10 req-7906db3b-cd10-4504-b412-d73c0858ca95 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Received event network-vif-plugged-5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 21:58:10 compute-0 nova_compute[192716]: 2025-10-07 21:58:10.060 2 DEBUG oslo_concurrency.lockutils [req-76381bdf-585b-478b-aaba-e2eb3e29dd10 req-7906db3b-cd10-4504-b412-d73c0858ca95 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "9e0851d2-e80c-42d9-8197-540d52ac8500-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:58:10 compute-0 nova_compute[192716]: 2025-10-07 21:58:10.061 2 DEBUG oslo_concurrency.lockutils [req-76381bdf-585b-478b-aaba-e2eb3e29dd10 req-7906db3b-cd10-4504-b412-d73c0858ca95 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "9e0851d2-e80c-42d9-8197-540d52ac8500-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:58:10 compute-0 nova_compute[192716]: 2025-10-07 21:58:10.061 2 DEBUG oslo_concurrency.lockutils [req-76381bdf-585b-478b-aaba-e2eb3e29dd10 req-7906db3b-cd10-4504-b412-d73c0858ca95 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "9e0851d2-e80c-42d9-8197-540d52ac8500-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:58:10 compute-0 nova_compute[192716]: 2025-10-07 21:58:10.062 2 DEBUG nova.compute.manager [req-76381bdf-585b-478b-aaba-e2eb3e29dd10 req-7906db3b-cd10-4504-b412-d73c0858ca95 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] No waiting events found dispatching network-vif-plugged-5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 21:58:10 compute-0 nova_compute[192716]: 2025-10-07 21:58:10.062 2 WARNING nova.compute.manager [req-76381bdf-585b-478b-aaba-e2eb3e29dd10 req-7906db3b-cd10-4504-b412-d73c0858ca95 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Received unexpected event network-vif-plugged-5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553 for instance with vm_state active and task_state None.
Oct 07 21:58:10 compute-0 nova_compute[192716]: 2025-10-07 21:58:10.097 2 INFO nova.compute.manager [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Took 18.27 seconds to build instance.
Oct 07 21:58:10 compute-0 nova_compute[192716]: 2025-10-07 21:58:10.603 2 DEBUG oslo_concurrency.lockutils [None req-ad54ec74-d490-4259-947c-328d91418b43 f98d2168fb30489d88896037aa86ab52 2aab90dc44084cef89c9f41e873a0e5b - - default default] Lock "9e0851d2-e80c-42d9-8197-540d52ac8500" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.798s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:58:14 compute-0 nova_compute[192716]: 2025-10-07 21:58:14.441 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:58:14 compute-0 nova_compute[192716]: 2025-10-07 21:58:14.959 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:58:17 compute-0 podman[218737]: 2025-10-07 21:58:17.829152161 +0000 UTC m=+0.066342233 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 07 21:58:17 compute-0 podman[218738]: 2025-10-07 21:58:17.833593879 +0000 UTC m=+0.070281477 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible)
Oct 07 21:58:19 compute-0 nova_compute[192716]: 2025-10-07 21:58:19.443 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:58:19 compute-0 nova_compute[192716]: 2025-10-07 21:58:19.961 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:58:20 compute-0 ovn_controller[94904]: 2025-10-07T21:58:20Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e4:5d:63 10.100.0.3
Oct 07 21:58:20 compute-0 ovn_controller[94904]: 2025-10-07T21:58:20Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e4:5d:63 10.100.0.3
Oct 07 21:58:21 compute-0 podman[218794]: 2025-10-07 21:58:21.856382287 +0000 UTC m=+0.081581993 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 07 21:58:24 compute-0 nova_compute[192716]: 2025-10-07 21:58:24.446 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:58:24 compute-0 nova_compute[192716]: 2025-10-07 21:58:24.964 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:58:24 compute-0 nova_compute[192716]: 2025-10-07 21:58:24.981 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:58:24 compute-0 nova_compute[192716]: 2025-10-07 21:58:24.982 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:58:24 compute-0 nova_compute[192716]: 2025-10-07 21:58:24.982 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 07 21:58:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:58:25.619 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:58:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:58:25.619 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:58:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:58:25.620 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:58:25 compute-0 nova_compute[192716]: 2025-10-07 21:58:25.986 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:58:25 compute-0 nova_compute[192716]: 2025-10-07 21:58:25.989 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:58:27 compute-0 nova_compute[192716]: 2025-10-07 21:58:27.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:58:28 compute-0 podman[218820]: 2025-10-07 21:58:28.867384793 +0000 UTC m=+0.100769725 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 07 21:58:28 compute-0 nova_compute[192716]: 2025-10-07 21:58:28.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:58:28 compute-0 nova_compute[192716]: 2025-10-07 21:58:28.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:58:29 compute-0 nova_compute[192716]: 2025-10-07 21:58:29.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:58:29 compute-0 nova_compute[192716]: 2025-10-07 21:58:29.504 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:58:29 compute-0 nova_compute[192716]: 2025-10-07 21:58:29.504 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:58:29 compute-0 nova_compute[192716]: 2025-10-07 21:58:29.505 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:58:29 compute-0 nova_compute[192716]: 2025-10-07 21:58:29.505 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 07 21:58:29 compute-0 podman[203153]: time="2025-10-07T21:58:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 21:58:29 compute-0 podman[203153]: @ - - [07/Oct/2025:21:58:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20751 "" "Go-http-client/1.1"
Oct 07 21:58:29 compute-0 podman[203153]: @ - - [07/Oct/2025:21:58:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3481 "" "Go-http-client/1.1"
Oct 07 21:58:29 compute-0 nova_compute[192716]: 2025-10-07 21:58:29.966 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:58:30 compute-0 nova_compute[192716]: 2025-10-07 21:58:30.556 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9e0851d2-e80c-42d9-8197-540d52ac8500/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:58:30 compute-0 nova_compute[192716]: 2025-10-07 21:58:30.680 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9e0851d2-e80c-42d9-8197-540d52ac8500/disk --force-share --output=json" returned: 0 in 0.124s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:58:30 compute-0 nova_compute[192716]: 2025-10-07 21:58:30.682 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9e0851d2-e80c-42d9-8197-540d52ac8500/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:58:30 compute-0 nova_compute[192716]: 2025-10-07 21:58:30.776 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9e0851d2-e80c-42d9-8197-540d52ac8500/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:58:30 compute-0 podman[218851]: 2025-10-07 21:58:30.890011975 +0000 UTC m=+0.119836065 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.build-date=20251007, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Oct 07 21:58:30 compute-0 nova_compute[192716]: 2025-10-07 21:58:30.996 2 WARNING nova.virt.libvirt.driver [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 21:58:30 compute-0 nova_compute[192716]: 2025-10-07 21:58:30.998 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:58:31 compute-0 nova_compute[192716]: 2025-10-07 21:58:31.022 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.024s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:58:31 compute-0 nova_compute[192716]: 2025-10-07 21:58:31.023 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5675MB free_disk=73.2769775390625GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 07 21:58:31 compute-0 nova_compute[192716]: 2025-10-07 21:58:31.024 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:58:31 compute-0 nova_compute[192716]: 2025-10-07 21:58:31.024 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:58:31 compute-0 openstack_network_exporter[205305]: ERROR   21:58:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 21:58:31 compute-0 openstack_network_exporter[205305]: ERROR   21:58:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:58:31 compute-0 openstack_network_exporter[205305]: ERROR   21:58:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:58:31 compute-0 openstack_network_exporter[205305]: ERROR   21:58:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 21:58:31 compute-0 openstack_network_exporter[205305]: ERROR   21:58:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 21:58:32 compute-0 nova_compute[192716]: 2025-10-07 21:58:32.089 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Instance 9e0851d2-e80c-42d9-8197-540d52ac8500 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 07 21:58:32 compute-0 nova_compute[192716]: 2025-10-07 21:58:32.090 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 07 21:58:32 compute-0 nova_compute[192716]: 2025-10-07 21:58:32.090 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:58:31 up  1:07,  0 user,  load average: 0.25, 0.32, 0.37\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_2aab90dc44084cef89c9f41e873a0e5b': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 07 21:58:32 compute-0 nova_compute[192716]: 2025-10-07 21:58:32.134 2 DEBUG nova.compute.provider_tree [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 21:58:32 compute-0 nova_compute[192716]: 2025-10-07 21:58:32.644 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 21:58:33 compute-0 nova_compute[192716]: 2025-10-07 21:58:33.153 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 07 21:58:33 compute-0 nova_compute[192716]: 2025-10-07 21:58:33.153 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.129s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:58:34 compute-0 nova_compute[192716]: 2025-10-07 21:58:34.497 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:58:34 compute-0 podman[218874]: 2025-10-07 21:58:34.828754129 +0000 UTC m=+0.072153080 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., name=ubi9-minimal, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, release=1755695350, container_name=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, vcs-type=git)
Oct 07 21:58:34 compute-0 nova_compute[192716]: 2025-10-07 21:58:34.968 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:58:35 compute-0 nova_compute[192716]: 2025-10-07 21:58:35.153 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:58:35 compute-0 nova_compute[192716]: 2025-10-07 21:58:35.807 2 DEBUG nova.compute.manager [None req-820d52e0-432a-4685-b416-cd8a94d7950f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:635
Oct 07 21:58:35 compute-0 nova_compute[192716]: 2025-10-07 21:58:35.859 2 DEBUG nova.compute.provider_tree [None req-820d52e0-432a-4685-b416-cd8a94d7950f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Updating resource provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 generation from 9 to 11 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Oct 07 21:58:37 compute-0 ovn_controller[94904]: 2025-10-07T21:58:37Z|00102|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Oct 07 21:58:39 compute-0 nova_compute[192716]: 2025-10-07 21:58:39.499 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:58:39 compute-0 nova_compute[192716]: 2025-10-07 21:58:39.970 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:58:44 compute-0 nova_compute[192716]: 2025-10-07 21:58:44.094 2 DEBUG nova.virt.libvirt.driver [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Check if temp file /var/lib/nova/instances/tmpr7pqfh9r exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Oct 07 21:58:44 compute-0 nova_compute[192716]: 2025-10-07 21:58:44.100 2 DEBUG nova.compute.manager [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpr7pqfh9r',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='9e0851d2-e80c-42d9-8197-540d52ac8500',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9294
Oct 07 21:58:44 compute-0 nova_compute[192716]: 2025-10-07 21:58:44.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:58:44 compute-0 nova_compute[192716]: 2025-10-07 21:58:44.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:58:48 compute-0 podman[218896]: 2025-10-07 21:58:48.832286202 +0000 UTC m=+0.067422414 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 07 21:58:48 compute-0 podman[218895]: 2025-10-07 21:58:48.864747707 +0000 UTC m=+0.099972112 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251007, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Oct 07 21:58:49 compute-0 nova_compute[192716]: 2025-10-07 21:58:49.505 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:58:49 compute-0 nova_compute[192716]: 2025-10-07 21:58:49.974 2 DEBUG oslo_concurrency.processutils [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9e0851d2-e80c-42d9-8197-540d52ac8500/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:58:49 compute-0 nova_compute[192716]: 2025-10-07 21:58:49.982 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:58:50 compute-0 nova_compute[192716]: 2025-10-07 21:58:50.050 2 DEBUG oslo_concurrency.processutils [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9e0851d2-e80c-42d9-8197-540d52ac8500/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:58:50 compute-0 nova_compute[192716]: 2025-10-07 21:58:50.050 2 DEBUG oslo_concurrency.processutils [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9e0851d2-e80c-42d9-8197-540d52ac8500/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:58:50 compute-0 nova_compute[192716]: 2025-10-07 21:58:50.130 2 DEBUG oslo_concurrency.processutils [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9e0851d2-e80c-42d9-8197-540d52ac8500/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:58:50 compute-0 nova_compute[192716]: 2025-10-07 21:58:50.131 2 DEBUG nova.compute.manager [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Preparing to wait for external event network-vif-plugged-5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Oct 07 21:58:50 compute-0 nova_compute[192716]: 2025-10-07 21:58:50.131 2 DEBUG oslo_concurrency.lockutils [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "9e0851d2-e80c-42d9-8197-540d52ac8500-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:58:50 compute-0 nova_compute[192716]: 2025-10-07 21:58:50.131 2 DEBUG oslo_concurrency.lockutils [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "9e0851d2-e80c-42d9-8197-540d52ac8500-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:58:50 compute-0 nova_compute[192716]: 2025-10-07 21:58:50.132 2 DEBUG oslo_concurrency.lockutils [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "9e0851d2-e80c-42d9-8197-540d52ac8500-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:58:52 compute-0 podman[218940]: 2025-10-07 21:58:52.823738686 +0000 UTC m=+0.062707158 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 07 21:58:54 compute-0 nova_compute[192716]: 2025-10-07 21:58:54.553 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:58:54 compute-0 nova_compute[192716]: 2025-10-07 21:58:54.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:58:56 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:58:56.086 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:e1:cc', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '62:84:ba:a4:1b:31'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 21:58:56 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:58:56.086 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 07 21:58:56 compute-0 nova_compute[192716]: 2025-10-07 21:58:56.087 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:58:56 compute-0 nova_compute[192716]: 2025-10-07 21:58:56.097 2 DEBUG nova.compute.manager [req-1350fbf6-8929-456c-9c2d-67bceb7be3e5 req-03ce4a8a-1132-4d16-b016-2f942ef84499 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Received event network-vif-unplugged-5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 21:58:56 compute-0 nova_compute[192716]: 2025-10-07 21:58:56.098 2 DEBUG oslo_concurrency.lockutils [req-1350fbf6-8929-456c-9c2d-67bceb7be3e5 req-03ce4a8a-1132-4d16-b016-2f942ef84499 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "9e0851d2-e80c-42d9-8197-540d52ac8500-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:58:56 compute-0 nova_compute[192716]: 2025-10-07 21:58:56.098 2 DEBUG oslo_concurrency.lockutils [req-1350fbf6-8929-456c-9c2d-67bceb7be3e5 req-03ce4a8a-1132-4d16-b016-2f942ef84499 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "9e0851d2-e80c-42d9-8197-540d52ac8500-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:58:56 compute-0 nova_compute[192716]: 2025-10-07 21:58:56.099 2 DEBUG oslo_concurrency.lockutils [req-1350fbf6-8929-456c-9c2d-67bceb7be3e5 req-03ce4a8a-1132-4d16-b016-2f942ef84499 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "9e0851d2-e80c-42d9-8197-540d52ac8500-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:58:56 compute-0 nova_compute[192716]: 2025-10-07 21:58:56.099 2 DEBUG nova.compute.manager [req-1350fbf6-8929-456c-9c2d-67bceb7be3e5 req-03ce4a8a-1132-4d16-b016-2f942ef84499 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] No event matching network-vif-unplugged-5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553 in dict_keys([('network-vif-plugged', '5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:349
Oct 07 21:58:56 compute-0 nova_compute[192716]: 2025-10-07 21:58:56.100 2 DEBUG nova.compute.manager [req-1350fbf6-8929-456c-9c2d-67bceb7be3e5 req-03ce4a8a-1132-4d16-b016-2f942ef84499 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Received event network-vif-unplugged-5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 07 21:58:57 compute-0 nova_compute[192716]: 2025-10-07 21:58:57.152 2 INFO nova.compute.manager [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Took 7.02 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Oct 07 21:58:58 compute-0 nova_compute[192716]: 2025-10-07 21:58:58.153 2 DEBUG nova.compute.manager [req-c24a0808-ab76-4a5e-9804-09b49a8d5447 req-e6ef159d-c8e2-417a-968b-b4c9694e5137 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Received event network-vif-plugged-5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 21:58:58 compute-0 nova_compute[192716]: 2025-10-07 21:58:58.154 2 DEBUG oslo_concurrency.lockutils [req-c24a0808-ab76-4a5e-9804-09b49a8d5447 req-e6ef159d-c8e2-417a-968b-b4c9694e5137 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "9e0851d2-e80c-42d9-8197-540d52ac8500-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:58:58 compute-0 nova_compute[192716]: 2025-10-07 21:58:58.155 2 DEBUG oslo_concurrency.lockutils [req-c24a0808-ab76-4a5e-9804-09b49a8d5447 req-e6ef159d-c8e2-417a-968b-b4c9694e5137 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "9e0851d2-e80c-42d9-8197-540d52ac8500-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:58:58 compute-0 nova_compute[192716]: 2025-10-07 21:58:58.155 2 DEBUG oslo_concurrency.lockutils [req-c24a0808-ab76-4a5e-9804-09b49a8d5447 req-e6ef159d-c8e2-417a-968b-b4c9694e5137 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "9e0851d2-e80c-42d9-8197-540d52ac8500-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:58:58 compute-0 nova_compute[192716]: 2025-10-07 21:58:58.156 2 DEBUG nova.compute.manager [req-c24a0808-ab76-4a5e-9804-09b49a8d5447 req-e6ef159d-c8e2-417a-968b-b4c9694e5137 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Processing event network-vif-plugged-5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Oct 07 21:58:58 compute-0 nova_compute[192716]: 2025-10-07 21:58:58.156 2 DEBUG nova.compute.manager [req-c24a0808-ab76-4a5e-9804-09b49a8d5447 req-e6ef159d-c8e2-417a-968b-b4c9694e5137 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Received event network-changed-5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 21:58:58 compute-0 nova_compute[192716]: 2025-10-07 21:58:58.157 2 DEBUG nova.compute.manager [req-c24a0808-ab76-4a5e-9804-09b49a8d5447 req-e6ef159d-c8e2-417a-968b-b4c9694e5137 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Refreshing instance network info cache due to event network-changed-5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Oct 07 21:58:58 compute-0 nova_compute[192716]: 2025-10-07 21:58:58.157 2 DEBUG oslo_concurrency.lockutils [req-c24a0808-ab76-4a5e-9804-09b49a8d5447 req-e6ef159d-c8e2-417a-968b-b4c9694e5137 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "refresh_cache-9e0851d2-e80c-42d9-8197-540d52ac8500" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 07 21:58:58 compute-0 nova_compute[192716]: 2025-10-07 21:58:58.158 2 DEBUG oslo_concurrency.lockutils [req-c24a0808-ab76-4a5e-9804-09b49a8d5447 req-e6ef159d-c8e2-417a-968b-b4c9694e5137 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquired lock "refresh_cache-9e0851d2-e80c-42d9-8197-540d52ac8500" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 07 21:58:58 compute-0 nova_compute[192716]: 2025-10-07 21:58:58.158 2 DEBUG nova.network.neutron [req-c24a0808-ab76-4a5e-9804-09b49a8d5447 req-e6ef159d-c8e2-417a-968b-b4c9694e5137 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Refreshing network info cache for port 5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Oct 07 21:58:58 compute-0 nova_compute[192716]: 2025-10-07 21:58:58.160 2 DEBUG nova.compute.manager [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Oct 07 21:58:58 compute-0 nova_compute[192716]: 2025-10-07 21:58:58.667 2 WARNING neutronclient.v2_0.client [req-c24a0808-ab76-4a5e-9804-09b49a8d5447 req-e6ef159d-c8e2-417a-968b-b4c9694e5137 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:58:58 compute-0 nova_compute[192716]: 2025-10-07 21:58:58.675 2 DEBUG nova.compute.manager [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpr7pqfh9r',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='9e0851d2-e80c-42d9-8197-540d52ac8500',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(32a1b330-f4a7-40f3-a59e-e43594d60c43),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9659
Oct 07 21:58:59 compute-0 nova_compute[192716]: 2025-10-07 21:58:59.126 2 WARNING neutronclient.v2_0.client [req-c24a0808-ab76-4a5e-9804-09b49a8d5447 req-e6ef159d-c8e2-417a-968b-b4c9694e5137 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:58:59 compute-0 nova_compute[192716]: 2025-10-07 21:58:59.194 2 DEBUG nova.objects.instance [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lazy-loading 'migration_context' on Instance uuid 9e0851d2-e80c-42d9-8197-540d52ac8500 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 21:58:59 compute-0 nova_compute[192716]: 2025-10-07 21:58:59.196 2 DEBUG nova.virt.libvirt.driver [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Oct 07 21:58:59 compute-0 nova_compute[192716]: 2025-10-07 21:58:59.198 2 DEBUG nova.virt.libvirt.driver [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Oct 07 21:58:59 compute-0 nova_compute[192716]: 2025-10-07 21:58:59.199 2 DEBUG nova.virt.libvirt.driver [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Oct 07 21:58:59 compute-0 nova_compute[192716]: 2025-10-07 21:58:59.302 2 DEBUG nova.network.neutron [req-c24a0808-ab76-4a5e-9804-09b49a8d5447 req-e6ef159d-c8e2-417a-968b-b4c9694e5137 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Updated VIF entry in instance network info cache for port 5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Oct 07 21:58:59 compute-0 nova_compute[192716]: 2025-10-07 21:58:59.302 2 DEBUG nova.network.neutron [req-c24a0808-ab76-4a5e-9804-09b49a8d5447 req-e6ef159d-c8e2-417a-968b-b4c9694e5137 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Updating instance_info_cache with network_info: [{"id": "5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553", "address": "fa:16:3e:e4:5d:63", "network": {"id": "b8350938-9140-4914-9fd9-576fae51c662", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1009648925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a049bf0f330a49e7aa11cf49f3632f49", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e2e8adf-9b", "ovs_interfaceid": "5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 21:58:59 compute-0 nova_compute[192716]: 2025-10-07 21:58:59.555 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:58:59 compute-0 nova_compute[192716]: 2025-10-07 21:58:59.701 2 DEBUG nova.virt.libvirt.driver [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Oct 07 21:58:59 compute-0 nova_compute[192716]: 2025-10-07 21:58:59.701 2 DEBUG nova.virt.libvirt.driver [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Oct 07 21:58:59 compute-0 nova_compute[192716]: 2025-10-07 21:58:59.710 2 DEBUG nova.virt.libvirt.vif [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-10-07T21:57:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-1485660204',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-1485660204',id=11,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-07T21:58:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='2aab90dc44084cef89c9f41e873a0e5b',ramdisk_id='',reservation_id='r-esrw4agk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteBasicStrategy-340226908',owner_user_name='tempest-TestExecuteBasicStrategy-340226908-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T21:58:09Z,user_data=None,user_id='f98d2168fb30489d88896037aa86ab52',uuid=9e0851d2-e80c-42d9-8197-540d52ac8500,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553", "address": "fa:16:3e:e4:5d:63", "network": {"id": "b8350938-9140-4914-9fd9-576fae51c662", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1009648925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a049bf0f330a49e7aa11cf49f3632f49", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap5e2e8adf-9b", "ovs_interfaceid": "5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Oct 07 21:58:59 compute-0 nova_compute[192716]: 2025-10-07 21:58:59.710 2 DEBUG nova.network.os_vif_util [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Converting VIF {"id": "5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553", "address": "fa:16:3e:e4:5d:63", "network": {"id": "b8350938-9140-4914-9fd9-576fae51c662", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1009648925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a049bf0f330a49e7aa11cf49f3632f49", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap5e2e8adf-9b", "ovs_interfaceid": "5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 21:58:59 compute-0 nova_compute[192716]: 2025-10-07 21:58:59.711 2 DEBUG nova.network.os_vif_util [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e4:5d:63,bridge_name='br-int',has_traffic_filtering=True,id=5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553,network=Network(b8350938-9140-4914-9fd9-576fae51c662),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5e2e8adf-9b') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 21:58:59 compute-0 nova_compute[192716]: 2025-10-07 21:58:59.712 2 DEBUG nova.virt.libvirt.migration [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Updating guest XML with vif config: <interface type="ethernet">
Oct 07 21:58:59 compute-0 nova_compute[192716]:   <mac address="fa:16:3e:e4:5d:63"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   <model type="virtio"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   <driver name="vhost" rx_queue_size="512"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   <mtu size="1442"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   <target dev="tap5e2e8adf-9b"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]: </interface>
Oct 07 21:58:59 compute-0 nova_compute[192716]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Oct 07 21:58:59 compute-0 nova_compute[192716]: 2025-10-07 21:58:59.712 2 DEBUG nova.virt.libvirt.migration [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Oct 07 21:58:59 compute-0 nova_compute[192716]:   <name>instance-0000000b</name>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   <uuid>9e0851d2-e80c-42d9-8197-540d52ac8500</uuid>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   <metadata>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <nova:package version="32.1.0-0.20251007122402.7278e66.el10"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <nova:name>tempest-TestExecuteBasicStrategy-server-1485660204</nova:name>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <nova:creationTime>2025-10-07 21:58:04</nova:creationTime>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <nova:flavor name="m1.nano" id="e6ccb6f6-2ec4-4305-9fdd-4d931e83ed21">
Oct 07 21:58:59 compute-0 nova_compute[192716]:         <nova:memory>128</nova:memory>
Oct 07 21:58:59 compute-0 nova_compute[192716]:         <nova:disk>1</nova:disk>
Oct 07 21:58:59 compute-0 nova_compute[192716]:         <nova:swap>0</nova:swap>
Oct 07 21:58:59 compute-0 nova_compute[192716]:         <nova:ephemeral>0</nova:ephemeral>
Oct 07 21:58:59 compute-0 nova_compute[192716]:         <nova:vcpus>1</nova:vcpus>
Oct 07 21:58:59 compute-0 nova_compute[192716]:         <nova:extraSpecs>
Oct 07 21:58:59 compute-0 nova_compute[192716]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 07 21:58:59 compute-0 nova_compute[192716]:         </nova:extraSpecs>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       </nova:flavor>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <nova:image uuid="c40cab67-7e52-4762-b275-de0efa24bdf4">
Oct 07 21:58:59 compute-0 nova_compute[192716]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 07 21:58:59 compute-0 nova_compute[192716]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 07 21:58:59 compute-0 nova_compute[192716]:         <nova:minDisk>1</nova:minDisk>
Oct 07 21:58:59 compute-0 nova_compute[192716]:         <nova:minRam>0</nova:minRam>
Oct 07 21:58:59 compute-0 nova_compute[192716]:         <nova:properties>
Oct 07 21:58:59 compute-0 nova_compute[192716]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 07 21:58:59 compute-0 nova_compute[192716]:         </nova:properties>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       </nova:image>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <nova:owner>
Oct 07 21:58:59 compute-0 nova_compute[192716]:         <nova:user uuid="f98d2168fb30489d88896037aa86ab52">tempest-TestExecuteBasicStrategy-340226908-project-admin</nova:user>
Oct 07 21:58:59 compute-0 nova_compute[192716]:         <nova:project uuid="2aab90dc44084cef89c9f41e873a0e5b">tempest-TestExecuteBasicStrategy-340226908</nova:project>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       </nova:owner>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <nova:root type="image" uuid="c40cab67-7e52-4762-b275-de0efa24bdf4"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <nova:ports>
Oct 07 21:58:59 compute-0 nova_compute[192716]:         <nova:port uuid="5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553">
Oct 07 21:58:59 compute-0 nova_compute[192716]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:         </nova:port>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       </nova:ports>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </nova:instance>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   </metadata>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   <memory unit="KiB">131072</memory>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   <currentMemory unit="KiB">131072</currentMemory>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   <vcpu placement="static">1</vcpu>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   <resource>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <partition>/machine</partition>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   </resource>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   <sysinfo type="smbios">
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <system>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <entry name="manufacturer">RDO</entry>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <entry name="product">OpenStack Compute</entry>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <entry name="version">32.1.0-0.20251007122402.7278e66.el10</entry>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <entry name="serial">9e0851d2-e80c-42d9-8197-540d52ac8500</entry>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <entry name="uuid">9e0851d2-e80c-42d9-8197-540d52ac8500</entry>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <entry name="family">Virtual Machine</entry>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </system>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   </sysinfo>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   <os>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <boot dev="hd"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <smbios mode="sysinfo"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   </os>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   <features>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <acpi/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <apic/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <vmcoreinfo state="on"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   </features>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   <cpu mode="host-model" check="partial">
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   </cpu>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   <clock offset="utc">
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <timer name="pit" tickpolicy="delay"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <timer name="hpet" present="no"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   </clock>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   <on_poweroff>destroy</on_poweroff>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   <on_reboot>restart</on_reboot>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   <on_crash>destroy</on_crash>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   <devices>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <disk type="file" device="disk">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <source file="/var/lib/nova/instances/9e0851d2-e80c-42d9-8197-540d52ac8500/disk"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target dev="vda" bus="virtio"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </disk>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <disk type="file" device="cdrom">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <driver name="qemu" type="raw" cache="none"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <source file="/var/lib/nova/instances/9e0851d2-e80c-42d9-8197-540d52ac8500/disk.config"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target dev="sda" bus="sata"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <readonly/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </disk>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="0" model="pcie-root"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="1" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="1" port="0x10"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="2" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="2" port="0x11"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="3" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="3" port="0x12"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="4" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="4" port="0x13"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="5" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="5" port="0x14"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="6" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="6" port="0x15"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="7" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="7" port="0x16"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="8" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="8" port="0x17"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="9" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="9" port="0x18"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="10" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="10" port="0x19"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="11" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="11" port="0x1a"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="12" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="12" port="0x1b"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="13" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="13" port="0x1c"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="14" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="14" port="0x1d"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="15" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="15" port="0x1e"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="16" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="16" port="0x1f"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="17" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="17" port="0x20"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="18" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="18" port="0x21"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="19" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="19" port="0x22"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="20" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="20" port="0x23"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="21" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="21" port="0x24"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="22" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="22" port="0x25"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="23" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="23" port="0x26"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="24" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="24" port="0x27"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="25" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="25" port="0x28"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-pci-bridge"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="usb" index="0" model="piix3-uhci">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="sata" index="0">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <interface type="ethernet"><mac address="fa:16:3e:e4:5d:63"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap5e2e8adf-9b"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </interface><serial type="pty">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <log file="/var/lib/nova/instances/9e0851d2-e80c-42d9-8197-540d52ac8500/console.log" append="off"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target type="isa-serial" port="0">
Oct 07 21:58:59 compute-0 nova_compute[192716]:         <model name="isa-serial"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       </target>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </serial>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <console type="pty">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <log file="/var/lib/nova/instances/9e0851d2-e80c-42d9-8197-540d52ac8500/console.log" append="off"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target type="serial" port="0"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </console>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <input type="tablet" bus="usb">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="usb" bus="0" port="1"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </input>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <input type="mouse" bus="ps2"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <listen type="address" address="::"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </graphics>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <video>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model type="virtio" heads="1" primary="yes"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </video>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <stats period="10"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </memballoon>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <rng model="virtio">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <backend model="random">/dev/urandom</backend>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </rng>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   </devices>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]: </domain>
Oct 07 21:58:59 compute-0 nova_compute[192716]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Oct 07 21:58:59 compute-0 nova_compute[192716]: 2025-10-07 21:58:59.714 2 DEBUG nova.virt.libvirt.migration [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Oct 07 21:58:59 compute-0 nova_compute[192716]:   <name>instance-0000000b</name>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   <uuid>9e0851d2-e80c-42d9-8197-540d52ac8500</uuid>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   <metadata>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <nova:package version="32.1.0-0.20251007122402.7278e66.el10"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <nova:name>tempest-TestExecuteBasicStrategy-server-1485660204</nova:name>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <nova:creationTime>2025-10-07 21:58:04</nova:creationTime>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <nova:flavor name="m1.nano" id="e6ccb6f6-2ec4-4305-9fdd-4d931e83ed21">
Oct 07 21:58:59 compute-0 nova_compute[192716]:         <nova:memory>128</nova:memory>
Oct 07 21:58:59 compute-0 nova_compute[192716]:         <nova:disk>1</nova:disk>
Oct 07 21:58:59 compute-0 nova_compute[192716]:         <nova:swap>0</nova:swap>
Oct 07 21:58:59 compute-0 nova_compute[192716]:         <nova:ephemeral>0</nova:ephemeral>
Oct 07 21:58:59 compute-0 nova_compute[192716]:         <nova:vcpus>1</nova:vcpus>
Oct 07 21:58:59 compute-0 nova_compute[192716]:         <nova:extraSpecs>
Oct 07 21:58:59 compute-0 nova_compute[192716]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 07 21:58:59 compute-0 nova_compute[192716]:         </nova:extraSpecs>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       </nova:flavor>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <nova:image uuid="c40cab67-7e52-4762-b275-de0efa24bdf4">
Oct 07 21:58:59 compute-0 nova_compute[192716]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 07 21:58:59 compute-0 nova_compute[192716]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 07 21:58:59 compute-0 nova_compute[192716]:         <nova:minDisk>1</nova:minDisk>
Oct 07 21:58:59 compute-0 nova_compute[192716]:         <nova:minRam>0</nova:minRam>
Oct 07 21:58:59 compute-0 nova_compute[192716]:         <nova:properties>
Oct 07 21:58:59 compute-0 nova_compute[192716]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 07 21:58:59 compute-0 nova_compute[192716]:         </nova:properties>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       </nova:image>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <nova:owner>
Oct 07 21:58:59 compute-0 nova_compute[192716]:         <nova:user uuid="f98d2168fb30489d88896037aa86ab52">tempest-TestExecuteBasicStrategy-340226908-project-admin</nova:user>
Oct 07 21:58:59 compute-0 nova_compute[192716]:         <nova:project uuid="2aab90dc44084cef89c9f41e873a0e5b">tempest-TestExecuteBasicStrategy-340226908</nova:project>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       </nova:owner>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <nova:root type="image" uuid="c40cab67-7e52-4762-b275-de0efa24bdf4"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <nova:ports>
Oct 07 21:58:59 compute-0 nova_compute[192716]:         <nova:port uuid="5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553">
Oct 07 21:58:59 compute-0 nova_compute[192716]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:         </nova:port>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       </nova:ports>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </nova:instance>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   </metadata>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   <memory unit="KiB">131072</memory>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   <currentMemory unit="KiB">131072</currentMemory>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   <vcpu placement="static">1</vcpu>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   <resource>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <partition>/machine</partition>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   </resource>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   <sysinfo type="smbios">
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <system>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <entry name="manufacturer">RDO</entry>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <entry name="product">OpenStack Compute</entry>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <entry name="version">32.1.0-0.20251007122402.7278e66.el10</entry>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <entry name="serial">9e0851d2-e80c-42d9-8197-540d52ac8500</entry>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <entry name="uuid">9e0851d2-e80c-42d9-8197-540d52ac8500</entry>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <entry name="family">Virtual Machine</entry>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </system>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   </sysinfo>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   <os>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <boot dev="hd"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <smbios mode="sysinfo"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   </os>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   <features>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <acpi/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <apic/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <vmcoreinfo state="on"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   </features>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   <cpu mode="host-model" check="partial">
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   </cpu>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   <clock offset="utc">
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <timer name="pit" tickpolicy="delay"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <timer name="hpet" present="no"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   </clock>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   <on_poweroff>destroy</on_poweroff>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   <on_reboot>restart</on_reboot>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   <on_crash>destroy</on_crash>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   <devices>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <disk type="file" device="disk">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <source file="/var/lib/nova/instances/9e0851d2-e80c-42d9-8197-540d52ac8500/disk"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target dev="vda" bus="virtio"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </disk>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <disk type="file" device="cdrom">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <driver name="qemu" type="raw" cache="none"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <source file="/var/lib/nova/instances/9e0851d2-e80c-42d9-8197-540d52ac8500/disk.config"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target dev="sda" bus="sata"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <readonly/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </disk>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="0" model="pcie-root"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="1" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="1" port="0x10"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="2" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="2" port="0x11"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="3" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="3" port="0x12"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="4" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="4" port="0x13"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="5" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="5" port="0x14"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="6" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="6" port="0x15"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="7" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="7" port="0x16"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="8" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="8" port="0x17"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="9" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="9" port="0x18"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="10" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="10" port="0x19"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="11" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="11" port="0x1a"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="12" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="12" port="0x1b"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="13" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="13" port="0x1c"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="14" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="14" port="0x1d"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="15" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="15" port="0x1e"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="16" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="16" port="0x1f"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="17" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="17" port="0x20"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="18" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="18" port="0x21"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="19" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="19" port="0x22"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="20" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="20" port="0x23"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="21" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="21" port="0x24"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="22" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="22" port="0x25"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="23" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="23" port="0x26"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="24" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="24" port="0x27"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="25" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="25" port="0x28"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-pci-bridge"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="usb" index="0" model="piix3-uhci">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="sata" index="0">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <interface type="ethernet"><mac address="fa:16:3e:e4:5d:63"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap5e2e8adf-9b"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </interface><serial type="pty">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <log file="/var/lib/nova/instances/9e0851d2-e80c-42d9-8197-540d52ac8500/console.log" append="off"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target type="isa-serial" port="0">
Oct 07 21:58:59 compute-0 nova_compute[192716]:         <model name="isa-serial"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       </target>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </serial>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <console type="pty">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <log file="/var/lib/nova/instances/9e0851d2-e80c-42d9-8197-540d52ac8500/console.log" append="off"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target type="serial" port="0"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </console>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <input type="tablet" bus="usb">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="usb" bus="0" port="1"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </input>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <input type="mouse" bus="ps2"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <listen type="address" address="::"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </graphics>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <video>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model type="virtio" heads="1" primary="yes"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </video>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <stats period="10"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </memballoon>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <rng model="virtio">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <backend model="random">/dev/urandom</backend>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </rng>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   </devices>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]: </domain>
Oct 07 21:58:59 compute-0 nova_compute[192716]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Oct 07 21:58:59 compute-0 nova_compute[192716]: 2025-10-07 21:58:59.717 2 DEBUG nova.virt.libvirt.migration [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] _update_pci_xml output xml=<domain type="kvm">
Oct 07 21:58:59 compute-0 nova_compute[192716]:   <name>instance-0000000b</name>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   <uuid>9e0851d2-e80c-42d9-8197-540d52ac8500</uuid>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   <metadata>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <nova:package version="32.1.0-0.20251007122402.7278e66.el10"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <nova:name>tempest-TestExecuteBasicStrategy-server-1485660204</nova:name>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <nova:creationTime>2025-10-07 21:58:04</nova:creationTime>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <nova:flavor name="m1.nano" id="e6ccb6f6-2ec4-4305-9fdd-4d931e83ed21">
Oct 07 21:58:59 compute-0 nova_compute[192716]:         <nova:memory>128</nova:memory>
Oct 07 21:58:59 compute-0 nova_compute[192716]:         <nova:disk>1</nova:disk>
Oct 07 21:58:59 compute-0 nova_compute[192716]:         <nova:swap>0</nova:swap>
Oct 07 21:58:59 compute-0 nova_compute[192716]:         <nova:ephemeral>0</nova:ephemeral>
Oct 07 21:58:59 compute-0 nova_compute[192716]:         <nova:vcpus>1</nova:vcpus>
Oct 07 21:58:59 compute-0 nova_compute[192716]:         <nova:extraSpecs>
Oct 07 21:58:59 compute-0 nova_compute[192716]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 07 21:58:59 compute-0 nova_compute[192716]:         </nova:extraSpecs>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       </nova:flavor>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <nova:image uuid="c40cab67-7e52-4762-b275-de0efa24bdf4">
Oct 07 21:58:59 compute-0 nova_compute[192716]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 07 21:58:59 compute-0 nova_compute[192716]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 07 21:58:59 compute-0 nova_compute[192716]:         <nova:minDisk>1</nova:minDisk>
Oct 07 21:58:59 compute-0 nova_compute[192716]:         <nova:minRam>0</nova:minRam>
Oct 07 21:58:59 compute-0 nova_compute[192716]:         <nova:properties>
Oct 07 21:58:59 compute-0 nova_compute[192716]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 07 21:58:59 compute-0 nova_compute[192716]:         </nova:properties>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       </nova:image>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <nova:owner>
Oct 07 21:58:59 compute-0 nova_compute[192716]:         <nova:user uuid="f98d2168fb30489d88896037aa86ab52">tempest-TestExecuteBasicStrategy-340226908-project-admin</nova:user>
Oct 07 21:58:59 compute-0 nova_compute[192716]:         <nova:project uuid="2aab90dc44084cef89c9f41e873a0e5b">tempest-TestExecuteBasicStrategy-340226908</nova:project>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       </nova:owner>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <nova:root type="image" uuid="c40cab67-7e52-4762-b275-de0efa24bdf4"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <nova:ports>
Oct 07 21:58:59 compute-0 nova_compute[192716]:         <nova:port uuid="5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553">
Oct 07 21:58:59 compute-0 nova_compute[192716]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:         </nova:port>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       </nova:ports>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </nova:instance>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   </metadata>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   <memory unit="KiB">131072</memory>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   <currentMemory unit="KiB">131072</currentMemory>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   <vcpu placement="static">1</vcpu>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   <resource>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <partition>/machine</partition>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   </resource>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   <sysinfo type="smbios">
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <system>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <entry name="manufacturer">RDO</entry>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <entry name="product">OpenStack Compute</entry>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <entry name="version">32.1.0-0.20251007122402.7278e66.el10</entry>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <entry name="serial">9e0851d2-e80c-42d9-8197-540d52ac8500</entry>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <entry name="uuid">9e0851d2-e80c-42d9-8197-540d52ac8500</entry>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <entry name="family">Virtual Machine</entry>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </system>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   </sysinfo>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   <os>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <boot dev="hd"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <smbios mode="sysinfo"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   </os>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   <features>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <acpi/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <apic/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <vmcoreinfo state="on"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   </features>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   <cpu mode="host-model" check="partial">
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   </cpu>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   <clock offset="utc">
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <timer name="pit" tickpolicy="delay"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <timer name="hpet" present="no"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   </clock>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   <on_poweroff>destroy</on_poweroff>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   <on_reboot>restart</on_reboot>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   <on_crash>destroy</on_crash>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   <devices>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <disk type="file" device="disk">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <source file="/var/lib/nova/instances/9e0851d2-e80c-42d9-8197-540d52ac8500/disk"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target dev="vda" bus="virtio"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </disk>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <disk type="file" device="cdrom">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <driver name="qemu" type="raw" cache="none"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <source file="/var/lib/nova/instances/9e0851d2-e80c-42d9-8197-540d52ac8500/disk.config"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target dev="sda" bus="sata"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <readonly/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </disk>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="0" model="pcie-root"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="1" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="1" port="0x10"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="2" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="2" port="0x11"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="3" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="3" port="0x12"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="4" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="4" port="0x13"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="5" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="5" port="0x14"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="6" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="6" port="0x15"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="7" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="7" port="0x16"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="8" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="8" port="0x17"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="9" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="9" port="0x18"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="10" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="10" port="0x19"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="11" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="11" port="0x1a"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="12" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="12" port="0x1b"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="13" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="13" port="0x1c"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="14" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="14" port="0x1d"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="15" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="15" port="0x1e"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="16" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="16" port="0x1f"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="17" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="17" port="0x20"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="18" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="18" port="0x21"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="19" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="19" port="0x22"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="20" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="20" port="0x23"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="21" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="21" port="0x24"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="22" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="22" port="0x25"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="23" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="23" port="0x26"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="24" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="24" port="0x27"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="25" model="pcie-root-port">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target chassis="25" port="0x28"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model name="pcie-pci-bridge"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="usb" index="0" model="piix3-uhci">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <controller type="sata" index="0">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </controller>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <interface type="ethernet">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <mac address="fa:16:3e:e4:5d:63"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model type="virtio"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <driver name="vhost" rx_queue_size="512"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <mtu size="1442"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target dev="tap5e2e8adf-9b"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </interface>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <serial type="pty">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <log file="/var/lib/nova/instances/9e0851d2-e80c-42d9-8197-540d52ac8500/console.log" append="off"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target type="isa-serial" port="0">
Oct 07 21:58:59 compute-0 nova_compute[192716]:         <model name="isa-serial"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       </target>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </serial>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <console type="pty">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <log file="/var/lib/nova/instances/9e0851d2-e80c-42d9-8197-540d52ac8500/console.log" append="off"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <target type="serial" port="0"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </console>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <input type="tablet" bus="usb">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="usb" bus="0" port="1"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </input>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <input type="mouse" bus="ps2"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <listen type="address" address="::"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </graphics>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <video>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <model type="virtio" heads="1" primary="yes"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </video>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <stats period="10"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </memballoon>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     <rng model="virtio">
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <backend model="random">/dev/urandom</backend>
Oct 07 21:58:59 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]:     </rng>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   </devices>
Oct 07 21:58:59 compute-0 nova_compute[192716]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Oct 07 21:58:59 compute-0 nova_compute[192716]: </domain>
Oct 07 21:58:59 compute-0 nova_compute[192716]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Oct 07 21:58:59 compute-0 nova_compute[192716]: 2025-10-07 21:58:59.718 2 DEBUG nova.virt.libvirt.driver [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Oct 07 21:58:59 compute-0 podman[203153]: time="2025-10-07T21:58:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 21:58:59 compute-0 podman[203153]: @ - - [07/Oct/2025:21:58:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20751 "" "Go-http-client/1.1"
Oct 07 21:58:59 compute-0 podman[203153]: @ - - [07/Oct/2025:21:58:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3483 "" "Go-http-client/1.1"
Oct 07 21:58:59 compute-0 nova_compute[192716]: 2025-10-07 21:58:59.807 2 DEBUG oslo_concurrency.lockutils [req-c24a0808-ab76-4a5e-9804-09b49a8d5447 req-e6ef159d-c8e2-417a-968b-b4c9694e5137 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Releasing lock "refresh_cache-9e0851d2-e80c-42d9-8197-540d52ac8500" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 07 21:58:59 compute-0 podman[218965]: 2025-10-07 21:58:59.925778356 +0000 UTC m=+0.156138681 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4)
Oct 07 21:58:59 compute-0 nova_compute[192716]: 2025-10-07 21:58:59.987 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:59:00 compute-0 nova_compute[192716]: 2025-10-07 21:59:00.205 2 DEBUG nova.virt.libvirt.migration [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Oct 07 21:59:00 compute-0 nova_compute[192716]: 2025-10-07 21:59:00.205 2 INFO nova.virt.libvirt.migration [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Increasing downtime to 50 ms after 0 sec elapsed time
Oct 07 21:59:01 compute-0 nova_compute[192716]: 2025-10-07 21:59:01.222 2 INFO nova.virt.libvirt.driver [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Oct 07 21:59:01 compute-0 openstack_network_exporter[205305]: ERROR   21:59:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:59:01 compute-0 openstack_network_exporter[205305]: ERROR   21:59:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:59:01 compute-0 openstack_network_exporter[205305]: ERROR   21:59:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 21:59:01 compute-0 openstack_network_exporter[205305]: ERROR   21:59:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 21:59:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 21:59:01 compute-0 openstack_network_exporter[205305]: ERROR   21:59:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 21:59:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 21:59:01 compute-0 nova_compute[192716]: 2025-10-07 21:59:01.725 2 DEBUG nova.virt.libvirt.migration [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Oct 07 21:59:01 compute-0 nova_compute[192716]: 2025-10-07 21:59:01.726 2 DEBUG nova.virt.libvirt.migration [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Oct 07 21:59:01 compute-0 podman[218991]: 2025-10-07 21:59:01.853782362 +0000 UTC m=+0.082321324 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2)
Oct 07 21:59:02 compute-0 nova_compute[192716]: 2025-10-07 21:59:02.228 2 DEBUG nova.virt.libvirt.migration [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Oct 07 21:59:02 compute-0 nova_compute[192716]: 2025-10-07 21:59:02.229 2 DEBUG nova.virt.libvirt.migration [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Oct 07 21:59:02 compute-0 nova_compute[192716]: 2025-10-07 21:59:02.734 2 DEBUG nova.virt.libvirt.migration [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Oct 07 21:59:02 compute-0 nova_compute[192716]: 2025-10-07 21:59:02.734 2 DEBUG nova.virt.libvirt.migration [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Oct 07 21:59:03 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:59:03.087 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=dca786dc-b408-4181-8e47-0e14c60f13da, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:59:03 compute-0 nova_compute[192716]: 2025-10-07 21:59:03.256 2 DEBUG nova.virt.libvirt.migration [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Current 50 elapsed 4 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Oct 07 21:59:03 compute-0 nova_compute[192716]: 2025-10-07 21:59:03.257 2 DEBUG nova.virt.libvirt.migration [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Oct 07 21:59:03 compute-0 kernel: tap5e2e8adf-9b (unregistering): left promiscuous mode
Oct 07 21:59:03 compute-0 NetworkManager[51722]: <info>  [1759874343.4869] device (tap5e2e8adf-9b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 07 21:59:03 compute-0 ovn_controller[94904]: 2025-10-07T21:59:03Z|00103|binding|INFO|Releasing lport 5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553 from this chassis (sb_readonly=0)
Oct 07 21:59:03 compute-0 ovn_controller[94904]: 2025-10-07T21:59:03Z|00104|binding|INFO|Setting lport 5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553 down in Southbound
Oct 07 21:59:03 compute-0 nova_compute[192716]: 2025-10-07 21:59:03.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:59:03 compute-0 ovn_controller[94904]: 2025-10-07T21:59:03Z|00105|binding|INFO|Removing iface tap5e2e8adf-9b ovn-installed in OVS
Oct 07 21:59:03 compute-0 nova_compute[192716]: 2025-10-07 21:59:03.505 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:59:03 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:59:03.510 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e4:5d:63 10.100.0.3'], port_security=['fa:16:3e:e4:5d:63 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '89a2e214-6e2f-462a-b578-1487fac3513c'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '9e0851d2-e80c-42d9-8197-540d52ac8500', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b8350938-9140-4914-9fd9-576fae51c662', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2aab90dc44084cef89c9f41e873a0e5b', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'a7ceae96-e65d-4bad-a42e-cea1bb910773', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b2ae172e-4471-427f-9271-ad3259dec3ad, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], logical_port=5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f071072e510>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 21:59:03 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:59:03.511 103791 INFO neutron.agent.ovn.metadata.agent [-] Port 5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553 in datapath b8350938-9140-4914-9fd9-576fae51c662 unbound from our chassis
Oct 07 21:59:03 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:59:03.512 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b8350938-9140-4914-9fd9-576fae51c662, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 07 21:59:03 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:59:03.515 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[c5c11522-f7b4-409e-b866-b6c35201851b]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:59:03 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:59:03.516 103791 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b8350938-9140-4914-9fd9-576fae51c662 namespace which is not needed anymore
Oct 07 21:59:03 compute-0 nova_compute[192716]: 2025-10-07 21:59:03.538 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:59:03 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Oct 07 21:59:03 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000b.scope: Consumed 14.672s CPU time.
Oct 07 21:59:03 compute-0 systemd-machined[152719]: Machine qemu-7-instance-0000000b terminated.
Oct 07 21:59:03 compute-0 neutron-haproxy-ovnmeta-b8350938-9140-4914-9fd9-576fae51c662[218721]: [NOTICE]   (218725) : haproxy version is 3.0.5-8e879a5
Oct 07 21:59:03 compute-0 neutron-haproxy-ovnmeta-b8350938-9140-4914-9fd9-576fae51c662[218721]: [NOTICE]   (218725) : path to executable is /usr/sbin/haproxy
Oct 07 21:59:03 compute-0 neutron-haproxy-ovnmeta-b8350938-9140-4914-9fd9-576fae51c662[218721]: [WARNING]  (218725) : Exiting Master process...
Oct 07 21:59:03 compute-0 podman[219051]: 2025-10-07 21:59:03.671130288 +0000 UTC m=+0.039954053 container kill 2d2fc71f77ef4e1d973a3a82c47beb315311e5279f42b702dda2274a22bdcb6c (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-b8350938-9140-4914-9fd9-576fae51c662, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct 07 21:59:03 compute-0 neutron-haproxy-ovnmeta-b8350938-9140-4914-9fd9-576fae51c662[218721]: [ALERT]    (218725) : Current worker (218727) exited with code 143 (Terminated)
Oct 07 21:59:03 compute-0 neutron-haproxy-ovnmeta-b8350938-9140-4914-9fd9-576fae51c662[218721]: [WARNING]  (218725) : All workers exited. Exiting... (0)
Oct 07 21:59:03 compute-0 systemd[1]: libpod-2d2fc71f77ef4e1d973a3a82c47beb315311e5279f42b702dda2274a22bdcb6c.scope: Deactivated successfully.
Oct 07 21:59:03 compute-0 nova_compute[192716]: 2025-10-07 21:59:03.681 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:59:03 compute-0 nova_compute[192716]: 2025-10-07 21:59:03.688 2 DEBUG nova.compute.manager [req-5b9cbb9e-47f2-404f-825f-cc7c74f06f9d req-947d3d81-7aab-48d6-bf0b-5619355a781d 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Received event network-vif-unplugged-5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 21:59:03 compute-0 nova_compute[192716]: 2025-10-07 21:59:03.689 2 DEBUG oslo_concurrency.lockutils [req-5b9cbb9e-47f2-404f-825f-cc7c74f06f9d req-947d3d81-7aab-48d6-bf0b-5619355a781d 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "9e0851d2-e80c-42d9-8197-540d52ac8500-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:59:03 compute-0 nova_compute[192716]: 2025-10-07 21:59:03.690 2 DEBUG oslo_concurrency.lockutils [req-5b9cbb9e-47f2-404f-825f-cc7c74f06f9d req-947d3d81-7aab-48d6-bf0b-5619355a781d 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "9e0851d2-e80c-42d9-8197-540d52ac8500-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:59:03 compute-0 nova_compute[192716]: 2025-10-07 21:59:03.690 2 DEBUG oslo_concurrency.lockutils [req-5b9cbb9e-47f2-404f-825f-cc7c74f06f9d req-947d3d81-7aab-48d6-bf0b-5619355a781d 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "9e0851d2-e80c-42d9-8197-540d52ac8500-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:59:03 compute-0 nova_compute[192716]: 2025-10-07 21:59:03.692 2 DEBUG nova.compute.manager [req-5b9cbb9e-47f2-404f-825f-cc7c74f06f9d req-947d3d81-7aab-48d6-bf0b-5619355a781d 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] No waiting events found dispatching network-vif-unplugged-5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 21:59:03 compute-0 nova_compute[192716]: 2025-10-07 21:59:03.692 2 DEBUG nova.compute.manager [req-5b9cbb9e-47f2-404f-825f-cc7c74f06f9d req-947d3d81-7aab-48d6-bf0b-5619355a781d 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Received event network-vif-unplugged-5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 07 21:59:03 compute-0 nova_compute[192716]: 2025-10-07 21:59:03.692 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:59:03 compute-0 nova_compute[192716]: 2025-10-07 21:59:03.738 2 DEBUG nova.virt.libvirt.driver [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Oct 07 21:59:03 compute-0 nova_compute[192716]: 2025-10-07 21:59:03.738 2 DEBUG nova.virt.libvirt.driver [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Oct 07 21:59:03 compute-0 nova_compute[192716]: 2025-10-07 21:59:03.738 2 DEBUG nova.virt.libvirt.driver [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Oct 07 21:59:03 compute-0 podman[219070]: 2025-10-07 21:59:03.748077075 +0000 UTC m=+0.048402966 container died 2d2fc71f77ef4e1d973a3a82c47beb315311e5279f42b702dda2274a22bdcb6c (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-b8350938-9140-4914-9fd9-576fae51c662, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct 07 21:59:03 compute-0 nova_compute[192716]: 2025-10-07 21:59:03.760 2 DEBUG nova.virt.libvirt.guest [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '9e0851d2-e80c-42d9-8197-540d52ac8500' (instance-0000000b) get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Oct 07 21:59:03 compute-0 nova_compute[192716]: 2025-10-07 21:59:03.761 2 INFO nova.virt.libvirt.driver [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Migration operation has completed
Oct 07 21:59:03 compute-0 nova_compute[192716]: 2025-10-07 21:59:03.762 2 INFO nova.compute.manager [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] _post_live_migration() is started..
Oct 07 21:59:03 compute-0 nova_compute[192716]: 2025-10-07 21:59:03.773 2 WARNING neutronclient.v2_0.client [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:59:03 compute-0 nova_compute[192716]: 2025-10-07 21:59:03.774 2 WARNING neutronclient.v2_0.client [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 21:59:03 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2d2fc71f77ef4e1d973a3a82c47beb315311e5279f42b702dda2274a22bdcb6c-userdata-shm.mount: Deactivated successfully.
Oct 07 21:59:03 compute-0 systemd[1]: var-lib-containers-storage-overlay-7eafe9f6c7aa5707faf15fbeb8d4f14187a5fc71710f9dfe67655ed482c1f8b8-merged.mount: Deactivated successfully.
Oct 07 21:59:03 compute-0 podman[219070]: 2025-10-07 21:59:03.803667037 +0000 UTC m=+0.103992848 container cleanup 2d2fc71f77ef4e1d973a3a82c47beb315311e5279f42b702dda2274a22bdcb6c (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-b8350938-9140-4914-9fd9-576fae51c662, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Oct 07 21:59:03 compute-0 systemd[1]: libpod-conmon-2d2fc71f77ef4e1d973a3a82c47beb315311e5279f42b702dda2274a22bdcb6c.scope: Deactivated successfully.
Oct 07 21:59:03 compute-0 podman[219076]: 2025-10-07 21:59:03.82319933 +0000 UTC m=+0.104888674 container remove 2d2fc71f77ef4e1d973a3a82c47beb315311e5279f42b702dda2274a22bdcb6c (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-b8350938-9140-4914-9fd9-576fae51c662, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, tcib_managed=true)
Oct 07 21:59:03 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:59:03.830 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[e3f33e17-de81-469c-95c0-796bdda7b36a]: (4, ("Tue Oct  7 09:59:03 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-b8350938-9140-4914-9fd9-576fae51c662 (2d2fc71f77ef4e1d973a3a82c47beb315311e5279f42b702dda2274a22bdcb6c)\n2d2fc71f77ef4e1d973a3a82c47beb315311e5279f42b702dda2274a22bdcb6c\nTue Oct  7 09:59:03 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-b8350938-9140-4914-9fd9-576fae51c662 (2d2fc71f77ef4e1d973a3a82c47beb315311e5279f42b702dda2274a22bdcb6c)\n2d2fc71f77ef4e1d973a3a82c47beb315311e5279f42b702dda2274a22bdcb6c\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:59:03 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:59:03.832 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[20588863-23a7-4cdb-9d14-bd1630d783bd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:59:03 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:59:03.833 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b8350938-9140-4914-9fd9-576fae51c662.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b8350938-9140-4914-9fd9-576fae51c662.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 21:59:03 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:59:03.834 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[c585e927-8d6a-4a64-b0b4-f666a21daf3b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:59:03 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:59:03.836 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb8350938-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:59:03 compute-0 nova_compute[192716]: 2025-10-07 21:59:03.838 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:59:03 compute-0 kernel: tapb8350938-90: left promiscuous mode
Oct 07 21:59:03 compute-0 nova_compute[192716]: 2025-10-07 21:59:03.854 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:59:03 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:59:03.859 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[9bdb59d1-400b-4c1f-a0ab-6dfee5feef2e]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:59:03 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:59:03.886 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[413a7642-eb44-4f82-aabc-1510aeddf594]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:59:03 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:59:03.887 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[4cb17c57-d79f-4e3f-bd59-87cdd626a54e]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:59:03 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:59:03.908 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[5575c3f2-ee62-48ba-b431-56dfdc406383]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 402120, 'reachable_time': 23730, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219113, 'error': None, 'target': 'ovnmeta-b8350938-9140-4914-9fd9-576fae51c662', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:59:03 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:59:03.911 103905 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b8350938-9140-4914-9fd9-576fae51c662 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Oct 07 21:59:03 compute-0 systemd[1]: run-netns-ovnmeta\x2db8350938\x2d9140\x2d4914\x2d9fd9\x2d576fae51c662.mount: Deactivated successfully.
Oct 07 21:59:03 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:59:03.912 103905 DEBUG oslo.privsep.daemon [-] privsep: reply[7fd78f2b-a38d-45d2-a147-8394205d6933]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:59:04 compute-0 nova_compute[192716]: 2025-10-07 21:59:04.558 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:59:04 compute-0 nova_compute[192716]: 2025-10-07 21:59:04.688 2 DEBUG nova.network.neutron [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Activated binding for port 5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Oct 07 21:59:04 compute-0 nova_compute[192716]: 2025-10-07 21:59:04.689 2 DEBUG nova.compute.manager [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553", "address": "fa:16:3e:e4:5d:63", "network": {"id": "b8350938-9140-4914-9fd9-576fae51c662", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1009648925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a049bf0f330a49e7aa11cf49f3632f49", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e2e8adf-9b", "ovs_interfaceid": "5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10059
Oct 07 21:59:04 compute-0 nova_compute[192716]: 2025-10-07 21:59:04.689 2 DEBUG nova.virt.libvirt.vif [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-10-07T21:57:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-1485660204',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-1485660204',id=11,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-07T21:58:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='2aab90dc44084cef89c9f41e873a0e5b',ramdisk_id='',reservation_id='r-esrw4agk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min
_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteBasicStrategy-340226908',owner_user_name='tempest-TestExecuteBasicStrategy-340226908-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T21:58:38Z,user_data=None,user_id='f98d2168fb30489d88896037aa86ab52',uuid=9e0851d2-e80c-42d9-8197-540d52ac8500,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553", "address": "fa:16:3e:e4:5d:63", "network": {"id": "b8350938-9140-4914-9fd9-576fae51c662", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1009648925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a049bf0f330a49e7aa11cf49f3632f49", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e2e8adf-9b", "ovs_interfaceid": "5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 07 21:59:04 compute-0 nova_compute[192716]: 2025-10-07 21:59:04.690 2 DEBUG nova.network.os_vif_util [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Converting VIF {"id": "5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553", "address": "fa:16:3e:e4:5d:63", "network": {"id": "b8350938-9140-4914-9fd9-576fae51c662", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1009648925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a049bf0f330a49e7aa11cf49f3632f49", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e2e8adf-9b", "ovs_interfaceid": "5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 21:59:04 compute-0 nova_compute[192716]: 2025-10-07 21:59:04.690 2 DEBUG nova.network.os_vif_util [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e4:5d:63,bridge_name='br-int',has_traffic_filtering=True,id=5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553,network=Network(b8350938-9140-4914-9fd9-576fae51c662),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5e2e8adf-9b') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 21:59:04 compute-0 nova_compute[192716]: 2025-10-07 21:59:04.691 2 DEBUG os_vif [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e4:5d:63,bridge_name='br-int',has_traffic_filtering=True,id=5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553,network=Network(b8350938-9140-4914-9fd9-576fae51c662),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5e2e8adf-9b') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 07 21:59:04 compute-0 nova_compute[192716]: 2025-10-07 21:59:04.692 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:59:04 compute-0 nova_compute[192716]: 2025-10-07 21:59:04.693 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5e2e8adf-9b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:59:04 compute-0 nova_compute[192716]: 2025-10-07 21:59:04.695 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:59:04 compute-0 nova_compute[192716]: 2025-10-07 21:59:04.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:59:04 compute-0 nova_compute[192716]: 2025-10-07 21:59:04.696 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=01e1e1cb-88cd-498f-b7ea-f1abf67e464b) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 21:59:04 compute-0 nova_compute[192716]: 2025-10-07 21:59:04.698 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:59:04 compute-0 nova_compute[192716]: 2025-10-07 21:59:04.700 2 INFO os_vif [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e4:5d:63,bridge_name='br-int',has_traffic_filtering=True,id=5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553,network=Network(b8350938-9140-4914-9fd9-576fae51c662),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5e2e8adf-9b')
Oct 07 21:59:04 compute-0 nova_compute[192716]: 2025-10-07 21:59:04.700 2 DEBUG oslo_concurrency.lockutils [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:59:04 compute-0 nova_compute[192716]: 2025-10-07 21:59:04.701 2 DEBUG oslo_concurrency.lockutils [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:59:04 compute-0 nova_compute[192716]: 2025-10-07 21:59:04.701 2 DEBUG oslo_concurrency.lockutils [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:59:04 compute-0 nova_compute[192716]: 2025-10-07 21:59:04.701 2 DEBUG nova.compute.manager [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10082
Oct 07 21:59:04 compute-0 nova_compute[192716]: 2025-10-07 21:59:04.702 2 INFO nova.virt.libvirt.driver [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Deleting instance files /var/lib/nova/instances/9e0851d2-e80c-42d9-8197-540d52ac8500_del
Oct 07 21:59:04 compute-0 nova_compute[192716]: 2025-10-07 21:59:04.703 2 INFO nova.virt.libvirt.driver [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Deletion of /var/lib/nova/instances/9e0851d2-e80c-42d9-8197-540d52ac8500_del complete
Oct 07 21:59:05 compute-0 nova_compute[192716]: 2025-10-07 21:59:05.767 2 DEBUG nova.compute.manager [req-b6ccba53-2ceb-49ba-8925-79fc823c2ba3 req-69d2b623-c93a-452b-83ee-991e16fe6a90 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Received event network-vif-plugged-5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 21:59:05 compute-0 nova_compute[192716]: 2025-10-07 21:59:05.768 2 DEBUG oslo_concurrency.lockutils [req-b6ccba53-2ceb-49ba-8925-79fc823c2ba3 req-69d2b623-c93a-452b-83ee-991e16fe6a90 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "9e0851d2-e80c-42d9-8197-540d52ac8500-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:59:05 compute-0 nova_compute[192716]: 2025-10-07 21:59:05.768 2 DEBUG oslo_concurrency.lockutils [req-b6ccba53-2ceb-49ba-8925-79fc823c2ba3 req-69d2b623-c93a-452b-83ee-991e16fe6a90 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "9e0851d2-e80c-42d9-8197-540d52ac8500-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:59:05 compute-0 nova_compute[192716]: 2025-10-07 21:59:05.769 2 DEBUG oslo_concurrency.lockutils [req-b6ccba53-2ceb-49ba-8925-79fc823c2ba3 req-69d2b623-c93a-452b-83ee-991e16fe6a90 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "9e0851d2-e80c-42d9-8197-540d52ac8500-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:59:05 compute-0 nova_compute[192716]: 2025-10-07 21:59:05.769 2 DEBUG nova.compute.manager [req-b6ccba53-2ceb-49ba-8925-79fc823c2ba3 req-69d2b623-c93a-452b-83ee-991e16fe6a90 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] No waiting events found dispatching network-vif-plugged-5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 21:59:05 compute-0 nova_compute[192716]: 2025-10-07 21:59:05.769 2 WARNING nova.compute.manager [req-b6ccba53-2ceb-49ba-8925-79fc823c2ba3 req-69d2b623-c93a-452b-83ee-991e16fe6a90 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Received unexpected event network-vif-plugged-5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553 for instance with vm_state active and task_state migrating.
Oct 07 21:59:05 compute-0 nova_compute[192716]: 2025-10-07 21:59:05.770 2 DEBUG nova.compute.manager [req-b6ccba53-2ceb-49ba-8925-79fc823c2ba3 req-69d2b623-c93a-452b-83ee-991e16fe6a90 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Received event network-vif-unplugged-5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 21:59:05 compute-0 nova_compute[192716]: 2025-10-07 21:59:05.770 2 DEBUG oslo_concurrency.lockutils [req-b6ccba53-2ceb-49ba-8925-79fc823c2ba3 req-69d2b623-c93a-452b-83ee-991e16fe6a90 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "9e0851d2-e80c-42d9-8197-540d52ac8500-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:59:05 compute-0 nova_compute[192716]: 2025-10-07 21:59:05.770 2 DEBUG oslo_concurrency.lockutils [req-b6ccba53-2ceb-49ba-8925-79fc823c2ba3 req-69d2b623-c93a-452b-83ee-991e16fe6a90 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "9e0851d2-e80c-42d9-8197-540d52ac8500-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:59:05 compute-0 nova_compute[192716]: 2025-10-07 21:59:05.771 2 DEBUG oslo_concurrency.lockutils [req-b6ccba53-2ceb-49ba-8925-79fc823c2ba3 req-69d2b623-c93a-452b-83ee-991e16fe6a90 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "9e0851d2-e80c-42d9-8197-540d52ac8500-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:59:05 compute-0 nova_compute[192716]: 2025-10-07 21:59:05.771 2 DEBUG nova.compute.manager [req-b6ccba53-2ceb-49ba-8925-79fc823c2ba3 req-69d2b623-c93a-452b-83ee-991e16fe6a90 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] No waiting events found dispatching network-vif-unplugged-5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 21:59:05 compute-0 nova_compute[192716]: 2025-10-07 21:59:05.771 2 DEBUG nova.compute.manager [req-b6ccba53-2ceb-49ba-8925-79fc823c2ba3 req-69d2b623-c93a-452b-83ee-991e16fe6a90 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Received event network-vif-unplugged-5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 07 21:59:05 compute-0 nova_compute[192716]: 2025-10-07 21:59:05.771 2 DEBUG nova.compute.manager [req-b6ccba53-2ceb-49ba-8925-79fc823c2ba3 req-69d2b623-c93a-452b-83ee-991e16fe6a90 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Received event network-vif-unplugged-5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 21:59:05 compute-0 nova_compute[192716]: 2025-10-07 21:59:05.772 2 DEBUG oslo_concurrency.lockutils [req-b6ccba53-2ceb-49ba-8925-79fc823c2ba3 req-69d2b623-c93a-452b-83ee-991e16fe6a90 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "9e0851d2-e80c-42d9-8197-540d52ac8500-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:59:05 compute-0 nova_compute[192716]: 2025-10-07 21:59:05.772 2 DEBUG oslo_concurrency.lockutils [req-b6ccba53-2ceb-49ba-8925-79fc823c2ba3 req-69d2b623-c93a-452b-83ee-991e16fe6a90 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "9e0851d2-e80c-42d9-8197-540d52ac8500-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:59:05 compute-0 nova_compute[192716]: 2025-10-07 21:59:05.772 2 DEBUG oslo_concurrency.lockutils [req-b6ccba53-2ceb-49ba-8925-79fc823c2ba3 req-69d2b623-c93a-452b-83ee-991e16fe6a90 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "9e0851d2-e80c-42d9-8197-540d52ac8500-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:59:05 compute-0 nova_compute[192716]: 2025-10-07 21:59:05.773 2 DEBUG nova.compute.manager [req-b6ccba53-2ceb-49ba-8925-79fc823c2ba3 req-69d2b623-c93a-452b-83ee-991e16fe6a90 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] No waiting events found dispatching network-vif-unplugged-5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 21:59:05 compute-0 nova_compute[192716]: 2025-10-07 21:59:05.773 2 DEBUG nova.compute.manager [req-b6ccba53-2ceb-49ba-8925-79fc823c2ba3 req-69d2b623-c93a-452b-83ee-991e16fe6a90 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Received event network-vif-unplugged-5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 07 21:59:05 compute-0 nova_compute[192716]: 2025-10-07 21:59:05.773 2 DEBUG nova.compute.manager [req-b6ccba53-2ceb-49ba-8925-79fc823c2ba3 req-69d2b623-c93a-452b-83ee-991e16fe6a90 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Received event network-vif-plugged-5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 21:59:05 compute-0 nova_compute[192716]: 2025-10-07 21:59:05.774 2 DEBUG oslo_concurrency.lockutils [req-b6ccba53-2ceb-49ba-8925-79fc823c2ba3 req-69d2b623-c93a-452b-83ee-991e16fe6a90 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "9e0851d2-e80c-42d9-8197-540d52ac8500-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:59:05 compute-0 nova_compute[192716]: 2025-10-07 21:59:05.774 2 DEBUG oslo_concurrency.lockutils [req-b6ccba53-2ceb-49ba-8925-79fc823c2ba3 req-69d2b623-c93a-452b-83ee-991e16fe6a90 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "9e0851d2-e80c-42d9-8197-540d52ac8500-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:59:05 compute-0 nova_compute[192716]: 2025-10-07 21:59:05.774 2 DEBUG oslo_concurrency.lockutils [req-b6ccba53-2ceb-49ba-8925-79fc823c2ba3 req-69d2b623-c93a-452b-83ee-991e16fe6a90 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "9e0851d2-e80c-42d9-8197-540d52ac8500-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:59:05 compute-0 nova_compute[192716]: 2025-10-07 21:59:05.775 2 DEBUG nova.compute.manager [req-b6ccba53-2ceb-49ba-8925-79fc823c2ba3 req-69d2b623-c93a-452b-83ee-991e16fe6a90 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] No waiting events found dispatching network-vif-plugged-5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 21:59:05 compute-0 nova_compute[192716]: 2025-10-07 21:59:05.775 2 WARNING nova.compute.manager [req-b6ccba53-2ceb-49ba-8925-79fc823c2ba3 req-69d2b623-c93a-452b-83ee-991e16fe6a90 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Received unexpected event network-vif-plugged-5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553 for instance with vm_state active and task_state migrating.
Oct 07 21:59:05 compute-0 nova_compute[192716]: 2025-10-07 21:59:05.775 2 DEBUG nova.compute.manager [req-b6ccba53-2ceb-49ba-8925-79fc823c2ba3 req-69d2b623-c93a-452b-83ee-991e16fe6a90 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Received event network-vif-plugged-5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 21:59:05 compute-0 nova_compute[192716]: 2025-10-07 21:59:05.775 2 DEBUG oslo_concurrency.lockutils [req-b6ccba53-2ceb-49ba-8925-79fc823c2ba3 req-69d2b623-c93a-452b-83ee-991e16fe6a90 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "9e0851d2-e80c-42d9-8197-540d52ac8500-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:59:05 compute-0 nova_compute[192716]: 2025-10-07 21:59:05.776 2 DEBUG oslo_concurrency.lockutils [req-b6ccba53-2ceb-49ba-8925-79fc823c2ba3 req-69d2b623-c93a-452b-83ee-991e16fe6a90 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "9e0851d2-e80c-42d9-8197-540d52ac8500-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:59:05 compute-0 nova_compute[192716]: 2025-10-07 21:59:05.776 2 DEBUG oslo_concurrency.lockutils [req-b6ccba53-2ceb-49ba-8925-79fc823c2ba3 req-69d2b623-c93a-452b-83ee-991e16fe6a90 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "9e0851d2-e80c-42d9-8197-540d52ac8500-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:59:05 compute-0 nova_compute[192716]: 2025-10-07 21:59:05.777 2 DEBUG nova.compute.manager [req-b6ccba53-2ceb-49ba-8925-79fc823c2ba3 req-69d2b623-c93a-452b-83ee-991e16fe6a90 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] No waiting events found dispatching network-vif-plugged-5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 21:59:05 compute-0 nova_compute[192716]: 2025-10-07 21:59:05.777 2 WARNING nova.compute.manager [req-b6ccba53-2ceb-49ba-8925-79fc823c2ba3 req-69d2b623-c93a-452b-83ee-991e16fe6a90 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Received unexpected event network-vif-plugged-5e2e8adf-9b38-4e50-bd3b-d0e20b3ac553 for instance with vm_state active and task_state migrating.
Oct 07 21:59:05 compute-0 podman[219115]: 2025-10-07 21:59:05.838779219 +0000 UTC m=+0.075652291 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., name=ubi9-minimal, version=9.6, distribution-scope=public, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, architecture=x86_64, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, container_name=openstack_network_exporter)
Oct 07 21:59:09 compute-0 nova_compute[192716]: 2025-10-07 21:59:09.560 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:59:09 compute-0 nova_compute[192716]: 2025-10-07 21:59:09.698 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:59:14 compute-0 nova_compute[192716]: 2025-10-07 21:59:14.563 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:59:14 compute-0 nova_compute[192716]: 2025-10-07 21:59:14.699 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:59:14 compute-0 nova_compute[192716]: 2025-10-07 21:59:14.741 2 DEBUG oslo_concurrency.lockutils [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "9e0851d2-e80c-42d9-8197-540d52ac8500-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:59:14 compute-0 nova_compute[192716]: 2025-10-07 21:59:14.741 2 DEBUG oslo_concurrency.lockutils [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "9e0851d2-e80c-42d9-8197-540d52ac8500-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:59:14 compute-0 nova_compute[192716]: 2025-10-07 21:59:14.742 2 DEBUG oslo_concurrency.lockutils [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "9e0851d2-e80c-42d9-8197-540d52ac8500-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:59:15 compute-0 nova_compute[192716]: 2025-10-07 21:59:15.267 2 DEBUG oslo_concurrency.lockutils [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:59:15 compute-0 nova_compute[192716]: 2025-10-07 21:59:15.268 2 DEBUG oslo_concurrency.lockutils [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:59:15 compute-0 nova_compute[192716]: 2025-10-07 21:59:15.268 2 DEBUG oslo_concurrency.lockutils [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:59:15 compute-0 nova_compute[192716]: 2025-10-07 21:59:15.268 2 DEBUG nova.compute.resource_tracker [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 07 21:59:15 compute-0 nova_compute[192716]: 2025-10-07 21:59:15.511 2 WARNING nova.virt.libvirt.driver [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 21:59:15 compute-0 nova_compute[192716]: 2025-10-07 21:59:15.513 2 DEBUG oslo_concurrency.processutils [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:59:15 compute-0 nova_compute[192716]: 2025-10-07 21:59:15.539 2 DEBUG oslo_concurrency.processutils [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.025s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:59:15 compute-0 nova_compute[192716]: 2025-10-07 21:59:15.540 2 DEBUG nova.compute.resource_tracker [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5847MB free_disk=73.30619430541992GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 07 21:59:15 compute-0 nova_compute[192716]: 2025-10-07 21:59:15.541 2 DEBUG oslo_concurrency.lockutils [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:59:15 compute-0 nova_compute[192716]: 2025-10-07 21:59:15.541 2 DEBUG oslo_concurrency.lockutils [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:59:16 compute-0 nova_compute[192716]: 2025-10-07 21:59:16.560 2 DEBUG nova.compute.resource_tracker [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Migration for instance 9e0851d2-e80c-42d9-8197-540d52ac8500 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Oct 07 21:59:17 compute-0 nova_compute[192716]: 2025-10-07 21:59:17.069 2 DEBUG nova.compute.resource_tracker [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Oct 07 21:59:17 compute-0 nova_compute[192716]: 2025-10-07 21:59:17.118 2 DEBUG nova.compute.resource_tracker [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Migration 32a1b330-f4a7-40f3-a59e-e43594d60c43 is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Oct 07 21:59:17 compute-0 nova_compute[192716]: 2025-10-07 21:59:17.119 2 DEBUG nova.compute.resource_tracker [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 07 21:59:17 compute-0 nova_compute[192716]: 2025-10-07 21:59:17.119 2 DEBUG nova.compute.resource_tracker [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:59:15 up  1:08,  0 user,  load average: 0.12, 0.28, 0.35\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 07 21:59:17 compute-0 nova_compute[192716]: 2025-10-07 21:59:17.211 2 DEBUG nova.compute.provider_tree [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 21:59:17 compute-0 nova_compute[192716]: 2025-10-07 21:59:17.720 2 DEBUG nova.scheduler.client.report [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 21:59:18 compute-0 nova_compute[192716]: 2025-10-07 21:59:18.232 2 DEBUG nova.compute.resource_tracker [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 07 21:59:18 compute-0 nova_compute[192716]: 2025-10-07 21:59:18.233 2 DEBUG oslo_concurrency.lockutils [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.692s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:59:18 compute-0 nova_compute[192716]: 2025-10-07 21:59:18.254 2 INFO nova.compute.manager [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Oct 07 21:59:19 compute-0 nova_compute[192716]: 2025-10-07 21:59:19.316 2 INFO nova.scheduler.client.report [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Deleted allocation for migration 32a1b330-f4a7-40f3-a59e-e43594d60c43
Oct 07 21:59:19 compute-0 nova_compute[192716]: 2025-10-07 21:59:19.316 2 DEBUG nova.virt.libvirt.driver [None req-1cabb0b8-aab3-4115-9134-7fe4741e800d 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 9e0851d2-e80c-42d9-8197-540d52ac8500] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Oct 07 21:59:19 compute-0 nova_compute[192716]: 2025-10-07 21:59:19.567 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:59:19 compute-0 nova_compute[192716]: 2025-10-07 21:59:19.701 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:59:19 compute-0 podman[219141]: 2025-10-07 21:59:19.854364108 +0000 UTC m=+0.083522378 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team)
Oct 07 21:59:19 compute-0 podman[219142]: 2025-10-07 21:59:19.869838874 +0000 UTC m=+0.098850179 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS)
Oct 07 21:59:21 compute-0 nova_compute[192716]: 2025-10-07 21:59:21.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:59:23 compute-0 podman[219183]: 2025-10-07 21:59:23.848658603 +0000 UTC m=+0.077884785 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 07 21:59:23 compute-0 nova_compute[192716]: 2025-10-07 21:59:23.989 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:59:23 compute-0 nova_compute[192716]: 2025-10-07 21:59:23.990 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 07 21:59:24 compute-0 nova_compute[192716]: 2025-10-07 21:59:24.570 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:59:24 compute-0 nova_compute[192716]: 2025-10-07 21:59:24.702 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:59:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:59:25.621 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:59:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:59:25.621 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:59:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:59:25.621 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:59:25 compute-0 nova_compute[192716]: 2025-10-07 21:59:25.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:59:26 compute-0 nova_compute[192716]: 2025-10-07 21:59:26.987 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:59:29 compute-0 nova_compute[192716]: 2025-10-07 21:59:29.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:59:29 compute-0 nova_compute[192716]: 2025-10-07 21:59:29.704 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:59:29 compute-0 podman[203153]: time="2025-10-07T21:59:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 21:59:29 compute-0 podman[203153]: @ - - [07/Oct/2025:21:59:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 21:59:29 compute-0 podman[203153]: @ - - [07/Oct/2025:21:59:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3016 "" "Go-http-client/1.1"
Oct 07 21:59:29 compute-0 nova_compute[192716]: 2025-10-07 21:59:29.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:59:29 compute-0 nova_compute[192716]: 2025-10-07 21:59:29.991 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:59:30 compute-0 nova_compute[192716]: 2025-10-07 21:59:30.505 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:59:30 compute-0 nova_compute[192716]: 2025-10-07 21:59:30.506 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:59:30 compute-0 nova_compute[192716]: 2025-10-07 21:59:30.506 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:59:30 compute-0 nova_compute[192716]: 2025-10-07 21:59:30.507 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 07 21:59:30 compute-0 nova_compute[192716]: 2025-10-07 21:59:30.687 2 WARNING nova.virt.libvirt.driver [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 21:59:30 compute-0 nova_compute[192716]: 2025-10-07 21:59:30.689 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 21:59:30 compute-0 nova_compute[192716]: 2025-10-07 21:59:30.708 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.020s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 21:59:30 compute-0 nova_compute[192716]: 2025-10-07 21:59:30.709 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5858MB free_disk=73.30619430541992GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 07 21:59:30 compute-0 nova_compute[192716]: 2025-10-07 21:59:30.710 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 21:59:30 compute-0 nova_compute[192716]: 2025-10-07 21:59:30.710 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 21:59:30 compute-0 podman[219209]: 2025-10-07 21:59:30.879180723 +0000 UTC m=+0.121802891 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Oct 07 21:59:31 compute-0 openstack_network_exporter[205305]: ERROR   21:59:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:59:31 compute-0 openstack_network_exporter[205305]: ERROR   21:59:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 21:59:31 compute-0 openstack_network_exporter[205305]: ERROR   21:59:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 21:59:31 compute-0 openstack_network_exporter[205305]: ERROR   21:59:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 21:59:31 compute-0 openstack_network_exporter[205305]: ERROR   21:59:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 21:59:31 compute-0 nova_compute[192716]: 2025-10-07 21:59:31.773 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 07 21:59:31 compute-0 nova_compute[192716]: 2025-10-07 21:59:31.773 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:59:30 up  1:08,  0 user,  load average: 0.09, 0.26, 0.35\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 07 21:59:31 compute-0 nova_compute[192716]: 2025-10-07 21:59:31.803 2 DEBUG nova.compute.provider_tree [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 21:59:32 compute-0 nova_compute[192716]: 2025-10-07 21:59:32.310 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 21:59:32 compute-0 nova_compute[192716]: 2025-10-07 21:59:32.822 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 07 21:59:32 compute-0 nova_compute[192716]: 2025-10-07 21:59:32.823 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.112s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 21:59:32 compute-0 podman[219235]: 2025-10-07 21:59:32.845945566 +0000 UTC m=+0.081331195 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Oct 07 21:59:33 compute-0 nova_compute[192716]: 2025-10-07 21:59:33.727 2 DEBUG nova.compute.manager [None req-3e75c090-6016-483b-8885-4d91301e2245 2c71b2f9f101437eaf6e12c33825a1df 293ff4341f3d48a4ae100bf4fc7b99bd - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:631
Oct 07 21:59:33 compute-0 nova_compute[192716]: 2025-10-07 21:59:33.797 2 DEBUG nova.compute.provider_tree [None req-3e75c090-6016-483b-8885-4d91301e2245 2c71b2f9f101437eaf6e12c33825a1df 293ff4341f3d48a4ae100bf4fc7b99bd - - default default] Updating resource provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 generation from 11 to 14 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Oct 07 21:59:33 compute-0 nova_compute[192716]: 2025-10-07 21:59:33.818 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:59:34 compute-0 nova_compute[192716]: 2025-10-07 21:59:34.327 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:59:34 compute-0 nova_compute[192716]: 2025-10-07 21:59:34.328 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 21:59:34 compute-0 nova_compute[192716]: 2025-10-07 21:59:34.576 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:59:34 compute-0 nova_compute[192716]: 2025-10-07 21:59:34.706 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:59:36 compute-0 podman[219254]: 2025-10-07 21:59:36.86653674 +0000 UTC m=+0.094494335 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, io.openshift.tags=minimal rhel9, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible)
Oct 07 21:59:39 compute-0 nova_compute[192716]: 2025-10-07 21:59:39.125 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:59:39 compute-0 nova_compute[192716]: 2025-10-07 21:59:39.578 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:59:39 compute-0 nova_compute[192716]: 2025-10-07 21:59:39.747 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:59:44 compute-0 nova_compute[192716]: 2025-10-07 21:59:44.581 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:59:44 compute-0 nova_compute[192716]: 2025-10-07 21:59:44.749 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:59:49 compute-0 nova_compute[192716]: 2025-10-07 21:59:49.583 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:59:49 compute-0 nova_compute[192716]: 2025-10-07 21:59:49.751 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:59:50 compute-0 podman[219275]: 2025-10-07 21:59:50.838635525 +0000 UTC m=+0.081162740 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251007, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Oct 07 21:59:50 compute-0 podman[219276]: 2025-10-07 21:59:50.859126975 +0000 UTC m=+0.089493550 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Oct 07 21:59:54 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:59:54.034 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b2:ce:1d 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-38575713-cfe6-4363-b8f2-d519a1658c28', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-38575713-cfe6-4363-b8f2-d519a1658c28', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '506f61dd7d8f4210a8fee10fc76cdd21', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=95c850f1-5b47-4230-8393-cdaa5c318e81, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=a9c6ebd1-60db-4495-9d46-56eded81811b) old=Port_Binding(mac=['fa:16:3e:b2:ce:1d'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-38575713-cfe6-4363-b8f2-d519a1658c28', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-38575713-cfe6-4363-b8f2-d519a1658c28', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '506f61dd7d8f4210a8fee10fc76cdd21', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 21:59:54 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:59:54.036 103791 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port a9c6ebd1-60db-4495-9d46-56eded81811b in datapath 38575713-cfe6-4363-b8f2-d519a1658c28 updated
Oct 07 21:59:54 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:59:54.037 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 38575713-cfe6-4363-b8f2-d519a1658c28, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 07 21:59:54 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:59:54.038 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[c884ce3b-9bc7-49c4-bc46-17c12036ae99]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 21:59:54 compute-0 nova_compute[192716]: 2025-10-07 21:59:54.585 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:59:54 compute-0 nova_compute[192716]: 2025-10-07 21:59:54.753 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:59:54 compute-0 podman[219317]: 2025-10-07 21:59:54.827125673 +0000 UTC m=+0.066149568 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 07 21:59:57 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:59:57.796 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:e1:cc', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '62:84:ba:a4:1b:31'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 21:59:57 compute-0 ovn_metadata_agent[103786]: 2025-10-07 21:59:57.797 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 07 21:59:57 compute-0 nova_compute[192716]: 2025-10-07 21:59:57.821 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:59:59 compute-0 nova_compute[192716]: 2025-10-07 21:59:59.586 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:59:59 compute-0 podman[203153]: time="2025-10-07T21:59:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 21:59:59 compute-0 nova_compute[192716]: 2025-10-07 21:59:59.754 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 21:59:59 compute-0 podman[203153]: @ - - [07/Oct/2025:21:59:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 21:59:59 compute-0 podman[203153]: @ - - [07/Oct/2025:21:59:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3010 "" "Go-http-client/1.1"
Oct 07 22:00:01 compute-0 openstack_network_exporter[205305]: ERROR   22:00:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:00:01 compute-0 openstack_network_exporter[205305]: ERROR   22:00:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:00:01 compute-0 openstack_network_exporter[205305]: ERROR   22:00:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:00:01 compute-0 openstack_network_exporter[205305]: ERROR   22:00:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:00:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:00:01 compute-0 openstack_network_exporter[205305]: ERROR   22:00:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:00:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:00:01 compute-0 podman[219345]: 2025-10-07 22:00:01.85410082 +0000 UTC m=+0.100403645 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 07 22:00:03 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:00:03.338 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c1:c6:a8 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-66117a56-99ff-401d-a8e6-7a1dfcc961af', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-66117a56-99ff-401d-a8e6-7a1dfcc961af', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8b6357569d04b338b9abbbbd98992e3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=68f17116-68fa-48de-9250-a9896caf4dcc, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=c5778659-e84f-4261-94de-4e2de8670aed) old=Port_Binding(mac=['fa:16:3e:c1:c6:a8'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-66117a56-99ff-401d-a8e6-7a1dfcc961af', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-66117a56-99ff-401d-a8e6-7a1dfcc961af', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8b6357569d04b338b9abbbbd98992e3', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:00:03 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:00:03.339 103791 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port c5778659-e84f-4261-94de-4e2de8670aed in datapath 66117a56-99ff-401d-a8e6-7a1dfcc961af updated
Oct 07 22:00:03 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:00:03.339 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 66117a56-99ff-401d-a8e6-7a1dfcc961af, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 07 22:00:03 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:00:03.340 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[953ae515-a03e-4b41-b990-5df827a7941c]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:00:03 compute-0 podman[219372]: 2025-10-07 22:00:03.823137047 +0000 UTC m=+0.058320152 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct 07 22:00:04 compute-0 nova_compute[192716]: 2025-10-07 22:00:04.588 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:00:04 compute-0 nova_compute[192716]: 2025-10-07 22:00:04.756 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:00:05 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:00:05.798 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=dca786dc-b408-4181-8e47-0e14c60f13da, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:00:07 compute-0 podman[219391]: 2025-10-07 22:00:07.847298114 +0000 UTC m=+0.081289273 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.buildah.version=1.33.7, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, release=1755695350, architecture=x86_64, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.openshift.tags=minimal rhel9)
Oct 07 22:00:09 compute-0 nova_compute[192716]: 2025-10-07 22:00:09.592 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:00:09 compute-0 nova_compute[192716]: 2025-10-07 22:00:09.758 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:00:14 compute-0 nova_compute[192716]: 2025-10-07 22:00:14.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:00:14 compute-0 nova_compute[192716]: 2025-10-07 22:00:14.760 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:00:15 compute-0 ovn_controller[94904]: 2025-10-07T22:00:15Z|00106|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Oct 07 22:00:19 compute-0 nova_compute[192716]: 2025-10-07 22:00:19.595 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:00:19 compute-0 nova_compute[192716]: 2025-10-07 22:00:19.762 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:00:21 compute-0 podman[219415]: 2025-10-07 22:00:21.851125286 +0000 UTC m=+0.082316203 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, tcib_build_tag=watcher_latest, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct 07 22:00:21 compute-0 podman[219414]: 2025-10-07 22:00:21.85957332 +0000 UTC m=+0.095359640 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 07 22:00:23 compute-0 nova_compute[192716]: 2025-10-07 22:00:23.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:00:23 compute-0 nova_compute[192716]: 2025-10-07 22:00:23.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:00:23 compute-0 nova_compute[192716]: 2025-10-07 22:00:23.990 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 07 22:00:24 compute-0 nova_compute[192716]: 2025-10-07 22:00:24.597 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:00:24 compute-0 nova_compute[192716]: 2025-10-07 22:00:24.764 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:00:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:00:25.622 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:00:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:00:25.623 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:00:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:00:25.623 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:00:25 compute-0 podman[219454]: 2025-10-07 22:00:25.873308066 +0000 UTC m=+0.110057533 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 07 22:00:27 compute-0 nova_compute[192716]: 2025-10-07 22:00:27.986 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:00:27 compute-0 nova_compute[192716]: 2025-10-07 22:00:27.989 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:00:29 compute-0 nova_compute[192716]: 2025-10-07 22:00:29.599 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:00:29 compute-0 podman[203153]: time="2025-10-07T22:00:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:00:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:00:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 22:00:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:00:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3017 "" "Go-http-client/1.1"
Oct 07 22:00:29 compute-0 nova_compute[192716]: 2025-10-07 22:00:29.767 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:00:31 compute-0 openstack_network_exporter[205305]: ERROR   22:00:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:00:31 compute-0 openstack_network_exporter[205305]: ERROR   22:00:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:00:31 compute-0 openstack_network_exporter[205305]: ERROR   22:00:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:00:31 compute-0 openstack_network_exporter[205305]: ERROR   22:00:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:00:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:00:31 compute-0 openstack_network_exporter[205305]: ERROR   22:00:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:00:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:00:31 compute-0 nova_compute[192716]: 2025-10-07 22:00:31.989 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:00:31 compute-0 nova_compute[192716]: 2025-10-07 22:00:31.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:00:31 compute-0 nova_compute[192716]: 2025-10-07 22:00:31.991 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:00:32 compute-0 nova_compute[192716]: 2025-10-07 22:00:32.512 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:00:32 compute-0 nova_compute[192716]: 2025-10-07 22:00:32.514 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:00:32 compute-0 nova_compute[192716]: 2025-10-07 22:00:32.514 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:00:32 compute-0 nova_compute[192716]: 2025-10-07 22:00:32.514 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 07 22:00:32 compute-0 nova_compute[192716]: 2025-10-07 22:00:32.734 2 WARNING nova.virt.libvirt.driver [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 22:00:32 compute-0 nova_compute[192716]: 2025-10-07 22:00:32.736 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:00:32 compute-0 nova_compute[192716]: 2025-10-07 22:00:32.772 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.036s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:00:32 compute-0 nova_compute[192716]: 2025-10-07 22:00:32.775 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5870MB free_disk=73.30619430541992GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 07 22:00:32 compute-0 nova_compute[192716]: 2025-10-07 22:00:32.775 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:00:32 compute-0 nova_compute[192716]: 2025-10-07 22:00:32.776 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:00:32 compute-0 podman[219478]: 2025-10-07 22:00:32.902820126 +0000 UTC m=+0.138815932 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2)
Oct 07 22:00:33 compute-0 nova_compute[192716]: 2025-10-07 22:00:33.838 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 07 22:00:33 compute-0 nova_compute[192716]: 2025-10-07 22:00:33.839 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:00:32 up  1:09,  0 user,  load average: 0.03, 0.21, 0.32\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 07 22:00:33 compute-0 nova_compute[192716]: 2025-10-07 22:00:33.864 2 DEBUG nova.compute.provider_tree [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 22:00:34 compute-0 nova_compute[192716]: 2025-10-07 22:00:34.373 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 22:00:34 compute-0 nova_compute[192716]: 2025-10-07 22:00:34.602 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:00:34 compute-0 nova_compute[192716]: 2025-10-07 22:00:34.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:00:34 compute-0 podman[219505]: 2025-10-07 22:00:34.85226671 +0000 UTC m=+0.085153055 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 07 22:00:34 compute-0 nova_compute[192716]: 2025-10-07 22:00:34.882 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 07 22:00:34 compute-0 nova_compute[192716]: 2025-10-07 22:00:34.882 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.106s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:00:35 compute-0 nova_compute[192716]: 2025-10-07 22:00:35.882 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:00:38 compute-0 podman[219525]: 2025-10-07 22:00:38.87554191 +0000 UTC m=+0.110369552 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, maintainer=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, vcs-type=git, managed_by=edpm_ansible, distribution-scope=public, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, release=1755695350, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Oct 07 22:00:39 compute-0 nova_compute[192716]: 2025-10-07 22:00:39.604 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:00:39 compute-0 nova_compute[192716]: 2025-10-07 22:00:39.771 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:00:39 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:00:39.950 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:af:b2:2e 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-726154fe-bda6-431d-b983-7caa973a9e17', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'df550d234d364e7fb20f9ac88392be8a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4057f821-2b26-4a21-8644-5757b0f352fc, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=b6dfddd4-019f-4508-ab9b-37759605366f) old=Port_Binding(mac=['fa:16:3e:af:b2:2e'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-726154fe-bda6-431d-b983-7caa973a9e17', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'df550d234d364e7fb20f9ac88392be8a', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:00:39 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:00:39.951 103791 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port b6dfddd4-019f-4508-ab9b-37759605366f in datapath 726154fe-bda6-431d-b983-7caa973a9e17 updated
Oct 07 22:00:39 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:00:39.952 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 726154fe-bda6-431d-b983-7caa973a9e17, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 07 22:00:39 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:00:39.953 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[de77f67c-9ed1-49ea-86a0-7f0289d0eb13]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:00:44 compute-0 nova_compute[192716]: 2025-10-07 22:00:44.606 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:00:44 compute-0 nova_compute[192716]: 2025-10-07 22:00:44.773 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:00:49 compute-0 nova_compute[192716]: 2025-10-07 22:00:49.606 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:00:49 compute-0 nova_compute[192716]: 2025-10-07 22:00:49.774 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:00:51 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:00:51.994 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:82:3a:28 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-6eece3a6-4e06-448c-a072-8f80e11b730b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6eece3a6-4e06-448c-a072-8f80e11b730b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ad27d63f39845acba6b21828806b82a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8083f874-d801-4e7b-8b92-5842e3e2f7a4, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=79057579-f95e-4ead-90eb-3cc0b257c512) old=Port_Binding(mac=['fa:16:3e:82:3a:28'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-6eece3a6-4e06-448c-a072-8f80e11b730b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6eece3a6-4e06-448c-a072-8f80e11b730b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ad27d63f39845acba6b21828806b82a', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:00:51 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:00:51.996 103791 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 79057579-f95e-4ead-90eb-3cc0b257c512 in datapath 6eece3a6-4e06-448c-a072-8f80e11b730b updated
Oct 07 22:00:51 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:00:51.996 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6eece3a6-4e06-448c-a072-8f80e11b730b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 07 22:00:51 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:00:51.997 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[6564c2c4-df8c-4e13-aace-a29f028d208b]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:00:52 compute-0 podman[219546]: 2025-10-07 22:00:52.855372162 +0000 UTC m=+0.089367176 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 07 22:00:52 compute-0 podman[219547]: 2025-10-07 22:00:52.855045253 +0000 UTC m=+0.083934990 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=multipathd)
Oct 07 22:00:54 compute-0 nova_compute[192716]: 2025-10-07 22:00:54.639 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:00:54 compute-0 nova_compute[192716]: 2025-10-07 22:00:54.775 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:00:56 compute-0 podman[219584]: 2025-10-07 22:00:56.847359161 +0000 UTC m=+0.072085459 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 07 22:00:59 compute-0 nova_compute[192716]: 2025-10-07 22:00:59.641 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:00:59 compute-0 podman[203153]: time="2025-10-07T22:00:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:00:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:00:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 22:00:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:00:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3018 "" "Go-http-client/1.1"
Oct 07 22:00:59 compute-0 nova_compute[192716]: 2025-10-07 22:00:59.776 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:01:01 compute-0 openstack_network_exporter[205305]: ERROR   22:01:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:01:01 compute-0 openstack_network_exporter[205305]: ERROR   22:01:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:01:01 compute-0 openstack_network_exporter[205305]: ERROR   22:01:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:01:01 compute-0 openstack_network_exporter[205305]: ERROR   22:01:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:01:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:01:01 compute-0 openstack_network_exporter[205305]: ERROR   22:01:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:01:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:01:01 compute-0 CROND[219610]: (root) CMD (run-parts /etc/cron.hourly)
Oct 07 22:01:01 compute-0 run-parts[219613]: (/etc/cron.hourly) starting 0anacron
Oct 07 22:01:01 compute-0 run-parts[219619]: (/etc/cron.hourly) finished 0anacron
Oct 07 22:01:01 compute-0 CROND[219609]: (root) CMDEND (run-parts /etc/cron.hourly)
Oct 07 22:01:03 compute-0 podman[219620]: 2025-10-07 22:01:03.892932575 +0000 UTC m=+0.124003175 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Oct 07 22:01:04 compute-0 nova_compute[192716]: 2025-10-07 22:01:04.642 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:01:04 compute-0 nova_compute[192716]: 2025-10-07 22:01:04.778 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:01:05 compute-0 podman[219646]: 2025-10-07 22:01:05.852226202 +0000 UTC m=+0.084550467 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct 07 22:01:07 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:01:07.826 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:e1:cc', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '62:84:ba:a4:1b:31'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:01:07 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:01:07.859 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 07 22:01:07 compute-0 nova_compute[192716]: 2025-10-07 22:01:07.861 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:01:09 compute-0 nova_compute[192716]: 2025-10-07 22:01:09.646 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:01:09 compute-0 nova_compute[192716]: 2025-10-07 22:01:09.779 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:01:09 compute-0 podman[219666]: 2025-10-07 22:01:09.827842349 +0000 UTC m=+0.071807990 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, container_name=openstack_network_exporter, io.openshift.expose-services=, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, vendor=Red Hat, Inc., release=1755695350, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct 07 22:01:09 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:01:09.860 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=dca786dc-b408-4181-8e47-0e14c60f13da, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:01:14 compute-0 nova_compute[192716]: 2025-10-07 22:01:14.648 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:01:14 compute-0 nova_compute[192716]: 2025-10-07 22:01:14.781 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:01:19 compute-0 nova_compute[192716]: 2025-10-07 22:01:19.649 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:01:19 compute-0 nova_compute[192716]: 2025-10-07 22:01:19.784 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:01:23 compute-0 podman[219690]: 2025-10-07 22:01:23.850173995 +0000 UTC m=+0.079507392 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, io.buildah.version=1.41.4, tcib_managed=true)
Oct 07 22:01:23 compute-0 podman[219689]: 2025-10-07 22:01:23.850913237 +0000 UTC m=+0.089837091 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct 07 22:01:23 compute-0 nova_compute[192716]: 2025-10-07 22:01:23.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:01:24 compute-0 nova_compute[192716]: 2025-10-07 22:01:24.192 2 DEBUG oslo_concurrency.lockutils [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Acquiring lock "05a008ac-6976-48f0-8fc6-2795863bdf61" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:01:24 compute-0 nova_compute[192716]: 2025-10-07 22:01:24.193 2 DEBUG oslo_concurrency.lockutils [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Lock "05a008ac-6976-48f0-8fc6-2795863bdf61" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:01:24 compute-0 nova_compute[192716]: 2025-10-07 22:01:24.650 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:01:24 compute-0 nova_compute[192716]: 2025-10-07 22:01:24.705 2 DEBUG nova.compute.manager [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Oct 07 22:01:24 compute-0 nova_compute[192716]: 2025-10-07 22:01:24.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:01:25 compute-0 nova_compute[192716]: 2025-10-07 22:01:25.261 2 DEBUG oslo_concurrency.lockutils [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:01:25 compute-0 nova_compute[192716]: 2025-10-07 22:01:25.261 2 DEBUG oslo_concurrency.lockutils [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:01:25 compute-0 nova_compute[192716]: 2025-10-07 22:01:25.269 2 DEBUG nova.virt.hardware [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Oct 07 22:01:25 compute-0 nova_compute[192716]: 2025-10-07 22:01:25.270 2 INFO nova.compute.claims [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Claim successful on node compute-0.ctlplane.example.com
Oct 07 22:01:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:01:25.624 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:01:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:01:25.625 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:01:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:01:25.625 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:01:26 compute-0 nova_compute[192716]: 2025-10-07 22:01:26.314 2 DEBUG nova.scheduler.client.report [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Refreshing inventories for resource provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Oct 07 22:01:26 compute-0 nova_compute[192716]: 2025-10-07 22:01:26.332 2 DEBUG nova.scheduler.client.report [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Updating ProviderTree inventory for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Oct 07 22:01:26 compute-0 nova_compute[192716]: 2025-10-07 22:01:26.333 2 DEBUG nova.compute.provider_tree [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Updating inventory in ProviderTree for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Oct 07 22:01:26 compute-0 nova_compute[192716]: 2025-10-07 22:01:26.354 2 DEBUG nova.scheduler.client.report [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Refreshing aggregate associations for resource provider 19d1aa8e-e3fb-43ab-9849-122569e48a32, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Oct 07 22:01:26 compute-0 nova_compute[192716]: 2025-10-07 22:01:26.385 2 DEBUG nova.scheduler.client.report [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Refreshing trait associations for resource provider 19d1aa8e-e3fb-43ab-9849-122569e48a32, traits: COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_RESCUE_BFV,COMPUTE_ACCELERATORS,HW_CPU_X86_CLMUL,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NODE,HW_CPU_X86_AVX2,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSSE3,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_ARCH_X86_64,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_DEVICE_TAGGING,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_EXTEND,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_AVX,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE41,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SVM,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_BMI2,COMPUTE_SOUND_MODEL_SB16,COMPUTE_SOUND_MODEL_USB,COMPUTE_SOUND_MODEL_AC97,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_AESNI,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,HW_CPU_X86_F16C,HW_CPU_X86_MMX,COMPUTE_SECURITY_TPM_TIS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_CRB,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_ABM,HW_ARCH_X86_64,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_STORAGE_BUS_SCSI _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Oct 07 22:01:26 compute-0 nova_compute[192716]: 2025-10-07 22:01:26.431 2 DEBUG nova.compute.provider_tree [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 22:01:26 compute-0 nova_compute[192716]: 2025-10-07 22:01:26.496 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:01:26 compute-0 nova_compute[192716]: 2025-10-07 22:01:26.497 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:01:26 compute-0 nova_compute[192716]: 2025-10-07 22:01:26.497 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 07 22:01:26 compute-0 nova_compute[192716]: 2025-10-07 22:01:26.938 2 DEBUG nova.scheduler.client.report [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 22:01:27 compute-0 nova_compute[192716]: 2025-10-07 22:01:27.450 2 DEBUG oslo_concurrency.lockutils [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.188s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:01:27 compute-0 nova_compute[192716]: 2025-10-07 22:01:27.451 2 DEBUG nova.compute.manager [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Oct 07 22:01:27 compute-0 podman[219730]: 2025-10-07 22:01:27.828792298 +0000 UTC m=+0.063142953 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 07 22:01:27 compute-0 nova_compute[192716]: 2025-10-07 22:01:27.964 2 DEBUG nova.compute.manager [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Oct 07 22:01:27 compute-0 nova_compute[192716]: 2025-10-07 22:01:27.964 2 DEBUG nova.network.neutron [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Oct 07 22:01:27 compute-0 nova_compute[192716]: 2025-10-07 22:01:27.965 2 WARNING neutronclient.v2_0.client [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:01:27 compute-0 nova_compute[192716]: 2025-10-07 22:01:27.966 2 WARNING neutronclient.v2_0.client [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:01:28 compute-0 nova_compute[192716]: 2025-10-07 22:01:28.472 2 INFO nova.virt.libvirt.driver [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 07 22:01:28 compute-0 nova_compute[192716]: 2025-10-07 22:01:28.981 2 DEBUG nova.compute.manager [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Oct 07 22:01:29 compute-0 nova_compute[192716]: 2025-10-07 22:01:29.652 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:01:29 compute-0 podman[203153]: time="2025-10-07T22:01:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:01:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:01:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 22:01:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:01:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3014 "" "Go-http-client/1.1"
Oct 07 22:01:29 compute-0 nova_compute[192716]: 2025-10-07 22:01:29.787 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:01:29 compute-0 nova_compute[192716]: 2025-10-07 22:01:29.987 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:01:29 compute-0 nova_compute[192716]: 2025-10-07 22:01:29.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:01:29 compute-0 nova_compute[192716]: 2025-10-07 22:01:29.999 2 DEBUG nova.compute.manager [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Oct 07 22:01:30 compute-0 nova_compute[192716]: 2025-10-07 22:01:30.001 2 DEBUG nova.virt.libvirt.driver [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Oct 07 22:01:30 compute-0 nova_compute[192716]: 2025-10-07 22:01:30.001 2 INFO nova.virt.libvirt.driver [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Creating image(s)
Oct 07 22:01:30 compute-0 nova_compute[192716]: 2025-10-07 22:01:30.002 2 DEBUG oslo_concurrency.lockutils [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Acquiring lock "/var/lib/nova/instances/05a008ac-6976-48f0-8fc6-2795863bdf61/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:01:30 compute-0 nova_compute[192716]: 2025-10-07 22:01:30.002 2 DEBUG oslo_concurrency.lockutils [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Lock "/var/lib/nova/instances/05a008ac-6976-48f0-8fc6-2795863bdf61/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:01:30 compute-0 nova_compute[192716]: 2025-10-07 22:01:30.003 2 DEBUG oslo_concurrency.lockutils [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Lock "/var/lib/nova/instances/05a008ac-6976-48f0-8fc6-2795863bdf61/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:01:30 compute-0 nova_compute[192716]: 2025-10-07 22:01:30.004 2 DEBUG oslo_utils.imageutils.format_inspector [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:01:30 compute-0 nova_compute[192716]: 2025-10-07 22:01:30.008 2 DEBUG oslo_utils.imageutils.format_inspector [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:01:30 compute-0 nova_compute[192716]: 2025-10-07 22:01:30.011 2 DEBUG oslo_concurrency.processutils [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:01:30 compute-0 nova_compute[192716]: 2025-10-07 22:01:30.087 2 DEBUG oslo_concurrency.processutils [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:01:30 compute-0 nova_compute[192716]: 2025-10-07 22:01:30.088 2 DEBUG oslo_concurrency.lockutils [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Acquiring lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:01:30 compute-0 nova_compute[192716]: 2025-10-07 22:01:30.088 2 DEBUG oslo_concurrency.lockutils [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:01:30 compute-0 nova_compute[192716]: 2025-10-07 22:01:30.089 2 DEBUG oslo_utils.imageutils.format_inspector [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:01:30 compute-0 nova_compute[192716]: 2025-10-07 22:01:30.094 2 DEBUG oslo_utils.imageutils.format_inspector [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:01:30 compute-0 nova_compute[192716]: 2025-10-07 22:01:30.094 2 DEBUG oslo_concurrency.processutils [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:01:30 compute-0 sshd-session[219754]: Connection closed by 200.7.37.26 port 50878
Oct 07 22:01:30 compute-0 nova_compute[192716]: 2025-10-07 22:01:30.154 2 DEBUG oslo_concurrency.processutils [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:01:30 compute-0 nova_compute[192716]: 2025-10-07 22:01:30.155 2 DEBUG oslo_concurrency.processutils [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71,backing_fmt=raw /var/lib/nova/instances/05a008ac-6976-48f0-8fc6-2795863bdf61/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:01:30 compute-0 nova_compute[192716]: 2025-10-07 22:01:30.183 2 DEBUG nova.network.neutron [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Successfully created port: a12818fb-48f8-4828-9483-8e2f0c930534 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Oct 07 22:01:30 compute-0 nova_compute[192716]: 2025-10-07 22:01:30.192 2 DEBUG oslo_concurrency.processutils [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71,backing_fmt=raw /var/lib/nova/instances/05a008ac-6976-48f0-8fc6-2795863bdf61/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:01:30 compute-0 nova_compute[192716]: 2025-10-07 22:01:30.193 2 DEBUG oslo_concurrency.lockutils [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.104s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:01:30 compute-0 nova_compute[192716]: 2025-10-07 22:01:30.193 2 DEBUG oslo_concurrency.processutils [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:01:30 compute-0 nova_compute[192716]: 2025-10-07 22:01:30.251 2 DEBUG oslo_concurrency.processutils [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:01:30 compute-0 nova_compute[192716]: 2025-10-07 22:01:30.252 2 DEBUG nova.virt.disk.api [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Checking if we can resize image /var/lib/nova/instances/05a008ac-6976-48f0-8fc6-2795863bdf61/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 07 22:01:30 compute-0 nova_compute[192716]: 2025-10-07 22:01:30.253 2 DEBUG oslo_concurrency.processutils [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/05a008ac-6976-48f0-8fc6-2795863bdf61/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:01:30 compute-0 nova_compute[192716]: 2025-10-07 22:01:30.339 2 DEBUG oslo_concurrency.processutils [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/05a008ac-6976-48f0-8fc6-2795863bdf61/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:01:30 compute-0 nova_compute[192716]: 2025-10-07 22:01:30.341 2 DEBUG nova.virt.disk.api [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Cannot resize image /var/lib/nova/instances/05a008ac-6976-48f0-8fc6-2795863bdf61/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 07 22:01:30 compute-0 nova_compute[192716]: 2025-10-07 22:01:30.342 2 DEBUG nova.virt.libvirt.driver [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Oct 07 22:01:30 compute-0 nova_compute[192716]: 2025-10-07 22:01:30.342 2 DEBUG nova.virt.libvirt.driver [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Ensure instance console log exists: /var/lib/nova/instances/05a008ac-6976-48f0-8fc6-2795863bdf61/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Oct 07 22:01:30 compute-0 nova_compute[192716]: 2025-10-07 22:01:30.343 2 DEBUG oslo_concurrency.lockutils [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:01:30 compute-0 nova_compute[192716]: 2025-10-07 22:01:30.344 2 DEBUG oslo_concurrency.lockutils [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:01:30 compute-0 nova_compute[192716]: 2025-10-07 22:01:30.344 2 DEBUG oslo_concurrency.lockutils [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:01:30 compute-0 nova_compute[192716]: 2025-10-07 22:01:30.991 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:01:30 compute-0 nova_compute[192716]: 2025-10-07 22:01:30.992 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Oct 07 22:01:31 compute-0 openstack_network_exporter[205305]: ERROR   22:01:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:01:31 compute-0 openstack_network_exporter[205305]: ERROR   22:01:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:01:31 compute-0 openstack_network_exporter[205305]: ERROR   22:01:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:01:31 compute-0 openstack_network_exporter[205305]: ERROR   22:01:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:01:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:01:31 compute-0 openstack_network_exporter[205305]: ERROR   22:01:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:01:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:01:31 compute-0 nova_compute[192716]: 2025-10-07 22:01:31.451 2 DEBUG nova.network.neutron [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Successfully updated port: a12818fb-48f8-4828-9483-8e2f0c930534 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Oct 07 22:01:31 compute-0 nova_compute[192716]: 2025-10-07 22:01:31.502 2 DEBUG nova.compute.manager [req-af2378b2-b88f-471e-be5e-d7db762bb541 req-9d896741-d201-4c06-a350-3ec1ae8a71d6 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Received event network-changed-a12818fb-48f8-4828-9483-8e2f0c930534 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:01:31 compute-0 nova_compute[192716]: 2025-10-07 22:01:31.502 2 DEBUG nova.compute.manager [req-af2378b2-b88f-471e-be5e-d7db762bb541 req-9d896741-d201-4c06-a350-3ec1ae8a71d6 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Refreshing instance network info cache due to event network-changed-a12818fb-48f8-4828-9483-8e2f0c930534. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Oct 07 22:01:31 compute-0 nova_compute[192716]: 2025-10-07 22:01:31.503 2 DEBUG oslo_concurrency.lockutils [req-af2378b2-b88f-471e-be5e-d7db762bb541 req-9d896741-d201-4c06-a350-3ec1ae8a71d6 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "refresh_cache-05a008ac-6976-48f0-8fc6-2795863bdf61" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 07 22:01:31 compute-0 nova_compute[192716]: 2025-10-07 22:01:31.504 2 DEBUG oslo_concurrency.lockutils [req-af2378b2-b88f-471e-be5e-d7db762bb541 req-9d896741-d201-4c06-a350-3ec1ae8a71d6 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquired lock "refresh_cache-05a008ac-6976-48f0-8fc6-2795863bdf61" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 07 22:01:31 compute-0 nova_compute[192716]: 2025-10-07 22:01:31.504 2 DEBUG nova.network.neutron [req-af2378b2-b88f-471e-be5e-d7db762bb541 req-9d896741-d201-4c06-a350-3ec1ae8a71d6 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Refreshing network info cache for port a12818fb-48f8-4828-9483-8e2f0c930534 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Oct 07 22:01:31 compute-0 nova_compute[192716]: 2025-10-07 22:01:31.506 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Oct 07 22:01:31 compute-0 nova_compute[192716]: 2025-10-07 22:01:31.959 2 DEBUG oslo_concurrency.lockutils [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Acquiring lock "refresh_cache-05a008ac-6976-48f0-8fc6-2795863bdf61" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 07 22:01:32 compute-0 nova_compute[192716]: 2025-10-07 22:01:32.012 2 WARNING neutronclient.v2_0.client [req-af2378b2-b88f-471e-be5e-d7db762bb541 req-9d896741-d201-4c06-a350-3ec1ae8a71d6 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:01:32 compute-0 nova_compute[192716]: 2025-10-07 22:01:32.159 2 DEBUG nova.network.neutron [req-af2378b2-b88f-471e-be5e-d7db762bb541 req-9d896741-d201-4c06-a350-3ec1ae8a71d6 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 07 22:01:32 compute-0 nova_compute[192716]: 2025-10-07 22:01:32.927 2 DEBUG nova.network.neutron [req-af2378b2-b88f-471e-be5e-d7db762bb541 req-9d896741-d201-4c06-a350-3ec1ae8a71d6 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:01:33 compute-0 nova_compute[192716]: 2025-10-07 22:01:33.440 2 DEBUG oslo_concurrency.lockutils [req-af2378b2-b88f-471e-be5e-d7db762bb541 req-9d896741-d201-4c06-a350-3ec1ae8a71d6 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Releasing lock "refresh_cache-05a008ac-6976-48f0-8fc6-2795863bdf61" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 07 22:01:33 compute-0 nova_compute[192716]: 2025-10-07 22:01:33.441 2 DEBUG oslo_concurrency.lockutils [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Acquired lock "refresh_cache-05a008ac-6976-48f0-8fc6-2795863bdf61" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 07 22:01:33 compute-0 nova_compute[192716]: 2025-10-07 22:01:33.441 2 DEBUG nova.network.neutron [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 07 22:01:33 compute-0 nova_compute[192716]: 2025-10-07 22:01:33.505 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:01:33 compute-0 nova_compute[192716]: 2025-10-07 22:01:33.985 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:01:34 compute-0 nova_compute[192716]: 2025-10-07 22:01:34.494 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:01:34 compute-0 nova_compute[192716]: 2025-10-07 22:01:34.494 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:01:34 compute-0 nova_compute[192716]: 2025-10-07 22:01:34.654 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:01:34 compute-0 nova_compute[192716]: 2025-10-07 22:01:34.689 2 DEBUG nova.network.neutron [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 07 22:01:34 compute-0 nova_compute[192716]: 2025-10-07 22:01:34.789 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:01:34 compute-0 podman[219770]: 2025-10-07 22:01:34.90620108 +0000 UTC m=+0.134552339 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2)
Oct 07 22:01:34 compute-0 nova_compute[192716]: 2025-10-07 22:01:34.953 2 WARNING neutronclient.v2_0.client [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:01:35 compute-0 nova_compute[192716]: 2025-10-07 22:01:35.004 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:01:35 compute-0 nova_compute[192716]: 2025-10-07 22:01:35.004 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:01:35 compute-0 nova_compute[192716]: 2025-10-07 22:01:35.005 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:01:35 compute-0 nova_compute[192716]: 2025-10-07 22:01:35.005 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 07 22:01:35 compute-0 nova_compute[192716]: 2025-10-07 22:01:35.195 2 WARNING nova.virt.libvirt.driver [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 22:01:35 compute-0 nova_compute[192716]: 2025-10-07 22:01:35.197 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:01:35 compute-0 nova_compute[192716]: 2025-10-07 22:01:35.238 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:01:35 compute-0 nova_compute[192716]: 2025-10-07 22:01:35.239 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5866MB free_disk=73.30598449707031GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 07 22:01:35 compute-0 nova_compute[192716]: 2025-10-07 22:01:35.240 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:01:35 compute-0 nova_compute[192716]: 2025-10-07 22:01:35.240 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:01:35 compute-0 nova_compute[192716]: 2025-10-07 22:01:35.320 2 DEBUG nova.network.neutron [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Updating instance_info_cache with network_info: [{"id": "a12818fb-48f8-4828-9483-8e2f0c930534", "address": "fa:16:3e:64:13:25", "network": {"id": "726154fe-bda6-431d-b983-7caa973a9e17", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-468758744-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "df550d234d364e7fb20f9ac88392be8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa12818fb-48", "ovs_interfaceid": "a12818fb-48f8-4828-9483-8e2f0c930534", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:01:35 compute-0 nova_compute[192716]: 2025-10-07 22:01:35.826 2 DEBUG oslo_concurrency.lockutils [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Releasing lock "refresh_cache-05a008ac-6976-48f0-8fc6-2795863bdf61" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 07 22:01:35 compute-0 nova_compute[192716]: 2025-10-07 22:01:35.827 2 DEBUG nova.compute.manager [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Instance network_info: |[{"id": "a12818fb-48f8-4828-9483-8e2f0c930534", "address": "fa:16:3e:64:13:25", "network": {"id": "726154fe-bda6-431d-b983-7caa973a9e17", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-468758744-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "df550d234d364e7fb20f9ac88392be8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa12818fb-48", "ovs_interfaceid": "a12818fb-48f8-4828-9483-8e2f0c930534", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Oct 07 22:01:35 compute-0 nova_compute[192716]: 2025-10-07 22:01:35.832 2 DEBUG nova.virt.libvirt.driver [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Start _get_guest_xml network_info=[{"id": "a12818fb-48f8-4828-9483-8e2f0c930534", "address": "fa:16:3e:64:13:25", "network": {"id": "726154fe-bda6-431d-b983-7caa973a9e17", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-468758744-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "df550d234d364e7fb20f9ac88392be8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa12818fb-48", "ovs_interfaceid": "a12818fb-48f8-4828-9483-8e2f0c930534", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T21:45:40Z,direct_url=<?>,disk_format='qcow2',id=c40cab67-7e52-4762-b275-de0efa24bdf4,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='293ff4341f3d48a4ae100bf4fc7b99bd',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T21:45:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'boot_index': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'image_id': 'c40cab67-7e52-4762-b275-de0efa24bdf4'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Oct 07 22:01:35 compute-0 nova_compute[192716]: 2025-10-07 22:01:35.837 2 WARNING nova.virt.libvirt.driver [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 22:01:35 compute-0 nova_compute[192716]: 2025-10-07 22:01:35.839 2 DEBUG nova.virt.driver [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='c40cab67-7e52-4762-b275-de0efa24bdf4', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteHostMaintenanceStrategy-server-907284284', uuid='05a008ac-6976-48f0-8fc6-2795863bdf61'), owner=OwnerMeta(userid='db99335261504aa7b84c7d30ec17d679', username='tempest-TestExecuteHostMaintenanceStrategy-152687663-project-admin', projectid='3ad27d63f39845acba6b21828806b82a', projectname='tempest-TestExecuteHostMaintenanceStrategy-152687663'), image=ImageMeta(id='c40cab67-7e52-4762-b275-de0efa24bdf4', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='e6ccb6f6-2ec4-4305-9fdd-4d931e83ed21', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "a12818fb-48f8-4828-9483-8e2f0c930534", "address": "fa:16:3e:64:13:25", "network": {"id": "726154fe-bda6-431d-b983-7caa973a9e17", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-468758744-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "df550d234d364e7fb20f9ac88392be8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa12818fb-48", "ovs_interfaceid": 
"a12818fb-48f8-4828-9483-8e2f0c930534", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251007122402.7278e66.el10', creation_time=1759874495.8396537) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Oct 07 22:01:35 compute-0 nova_compute[192716]: 2025-10-07 22:01:35.843 2 DEBUG nova.virt.libvirt.host [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Oct 07 22:01:35 compute-0 nova_compute[192716]: 2025-10-07 22:01:35.845 2 DEBUG nova.virt.libvirt.host [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Oct 07 22:01:35 compute-0 nova_compute[192716]: 2025-10-07 22:01:35.849 2 DEBUG nova.virt.libvirt.host [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Oct 07 22:01:35 compute-0 nova_compute[192716]: 2025-10-07 22:01:35.850 2 DEBUG nova.virt.libvirt.host [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Oct 07 22:01:35 compute-0 nova_compute[192716]: 2025-10-07 22:01:35.850 2 DEBUG nova.virt.libvirt.driver [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Oct 07 22:01:35 compute-0 nova_compute[192716]: 2025-10-07 22:01:35.851 2 DEBUG nova.virt.hardware [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-07T21:45:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='e6ccb6f6-2ec4-4305-9fdd-4d931e83ed21',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T21:45:40Z,direct_url=<?>,disk_format='qcow2',id=c40cab67-7e52-4762-b275-de0efa24bdf4,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='293ff4341f3d48a4ae100bf4fc7b99bd',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T21:45:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Oct 07 22:01:35 compute-0 nova_compute[192716]: 2025-10-07 22:01:35.852 2 DEBUG nova.virt.hardware [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Oct 07 22:01:35 compute-0 nova_compute[192716]: 2025-10-07 22:01:35.852 2 DEBUG nova.virt.hardware [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Oct 07 22:01:35 compute-0 nova_compute[192716]: 2025-10-07 22:01:35.853 2 DEBUG nova.virt.hardware [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Oct 07 22:01:35 compute-0 nova_compute[192716]: 2025-10-07 22:01:35.853 2 DEBUG nova.virt.hardware [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Oct 07 22:01:35 compute-0 nova_compute[192716]: 2025-10-07 22:01:35.854 2 DEBUG nova.virt.hardware [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Oct 07 22:01:35 compute-0 nova_compute[192716]: 2025-10-07 22:01:35.854 2 DEBUG nova.virt.hardware [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Oct 07 22:01:35 compute-0 nova_compute[192716]: 2025-10-07 22:01:35.855 2 DEBUG nova.virt.hardware [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Oct 07 22:01:35 compute-0 nova_compute[192716]: 2025-10-07 22:01:35.856 2 DEBUG nova.virt.hardware [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Oct 07 22:01:35 compute-0 nova_compute[192716]: 2025-10-07 22:01:35.856 2 DEBUG nova.virt.hardware [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Oct 07 22:01:35 compute-0 nova_compute[192716]: 2025-10-07 22:01:35.857 2 DEBUG nova.virt.hardware [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Oct 07 22:01:35 compute-0 nova_compute[192716]: 2025-10-07 22:01:35.864 2 DEBUG nova.virt.libvirt.vif [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-07T22:01:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-907284284',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-907284284',id=13,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3ad27d63f39845acba6b21828806b82a',ramdisk_id='',reservation_id='r-swr0k1fo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-152687663',owner_user_name='tempest-TestExecu
teHostMaintenanceStrategy-152687663-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T22:01:29Z,user_data=None,user_id='db99335261504aa7b84c7d30ec17d679',uuid=05a008ac-6976-48f0-8fc6-2795863bdf61,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a12818fb-48f8-4828-9483-8e2f0c930534", "address": "fa:16:3e:64:13:25", "network": {"id": "726154fe-bda6-431d-b983-7caa973a9e17", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-468758744-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "df550d234d364e7fb20f9ac88392be8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa12818fb-48", "ovs_interfaceid": "a12818fb-48f8-4828-9483-8e2f0c930534", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Oct 07 22:01:35 compute-0 nova_compute[192716]: 2025-10-07 22:01:35.865 2 DEBUG nova.network.os_vif_util [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Converting VIF {"id": "a12818fb-48f8-4828-9483-8e2f0c930534", "address": "fa:16:3e:64:13:25", "network": {"id": "726154fe-bda6-431d-b983-7caa973a9e17", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-468758744-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "df550d234d364e7fb20f9ac88392be8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa12818fb-48", "ovs_interfaceid": "a12818fb-48f8-4828-9483-8e2f0c930534", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 22:01:35 compute-0 nova_compute[192716]: 2025-10-07 22:01:35.867 2 DEBUG nova.network.os_vif_util [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:64:13:25,bridge_name='br-int',has_traffic_filtering=True,id=a12818fb-48f8-4828-9483-8e2f0c930534,network=Network(726154fe-bda6-431d-b983-7caa973a9e17),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa12818fb-48') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 22:01:35 compute-0 nova_compute[192716]: 2025-10-07 22:01:35.869 2 DEBUG nova.objects.instance [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Lazy-loading 'pci_devices' on Instance uuid 05a008ac-6976-48f0-8fc6-2795863bdf61 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 22:01:36 compute-0 nova_compute[192716]: 2025-10-07 22:01:36.296 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Instance 05a008ac-6976-48f0-8fc6-2795863bdf61 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 07 22:01:36 compute-0 nova_compute[192716]: 2025-10-07 22:01:36.297 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 07 22:01:36 compute-0 nova_compute[192716]: 2025-10-07 22:01:36.297 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:01:35 up  1:10,  0 user,  load average: 0.01, 0.17, 0.30\n', 'num_instances': '1', 'num_vm_building': '1', 'num_task_spawning': '1', 'num_os_type_None': '1', 'num_proj_3ad27d63f39845acba6b21828806b82a': '1', 'io_workload': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 07 22:01:36 compute-0 nova_compute[192716]: 2025-10-07 22:01:36.347 2 DEBUG nova.compute.provider_tree [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 22:01:36 compute-0 nova_compute[192716]: 2025-10-07 22:01:36.377 2 DEBUG nova.virt.libvirt.driver [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] End _get_guest_xml xml=<domain type="kvm">
Oct 07 22:01:36 compute-0 nova_compute[192716]:   <uuid>05a008ac-6976-48f0-8fc6-2795863bdf61</uuid>
Oct 07 22:01:36 compute-0 nova_compute[192716]:   <name>instance-0000000d</name>
Oct 07 22:01:36 compute-0 nova_compute[192716]:   <memory>131072</memory>
Oct 07 22:01:36 compute-0 nova_compute[192716]:   <vcpu>1</vcpu>
Oct 07 22:01:36 compute-0 nova_compute[192716]:   <metadata>
Oct 07 22:01:36 compute-0 nova_compute[192716]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 07 22:01:36 compute-0 nova_compute[192716]:       <nova:package version="32.1.0-0.20251007122402.7278e66.el10"/>
Oct 07 22:01:36 compute-0 nova_compute[192716]:       <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-907284284</nova:name>
Oct 07 22:01:36 compute-0 nova_compute[192716]:       <nova:creationTime>2025-10-07 22:01:35</nova:creationTime>
Oct 07 22:01:36 compute-0 nova_compute[192716]:       <nova:flavor name="m1.nano" id="e6ccb6f6-2ec4-4305-9fdd-4d931e83ed21">
Oct 07 22:01:36 compute-0 nova_compute[192716]:         <nova:memory>128</nova:memory>
Oct 07 22:01:36 compute-0 nova_compute[192716]:         <nova:disk>1</nova:disk>
Oct 07 22:01:36 compute-0 nova_compute[192716]:         <nova:swap>0</nova:swap>
Oct 07 22:01:36 compute-0 nova_compute[192716]:         <nova:ephemeral>0</nova:ephemeral>
Oct 07 22:01:36 compute-0 nova_compute[192716]:         <nova:vcpus>1</nova:vcpus>
Oct 07 22:01:36 compute-0 nova_compute[192716]:         <nova:extraSpecs>
Oct 07 22:01:36 compute-0 nova_compute[192716]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 07 22:01:36 compute-0 nova_compute[192716]:         </nova:extraSpecs>
Oct 07 22:01:36 compute-0 nova_compute[192716]:       </nova:flavor>
Oct 07 22:01:36 compute-0 nova_compute[192716]:       <nova:image uuid="c40cab67-7e52-4762-b275-de0efa24bdf4">
Oct 07 22:01:36 compute-0 nova_compute[192716]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 07 22:01:36 compute-0 nova_compute[192716]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 07 22:01:36 compute-0 nova_compute[192716]:         <nova:minDisk>1</nova:minDisk>
Oct 07 22:01:36 compute-0 nova_compute[192716]:         <nova:minRam>0</nova:minRam>
Oct 07 22:01:36 compute-0 nova_compute[192716]:         <nova:properties>
Oct 07 22:01:36 compute-0 nova_compute[192716]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 07 22:01:36 compute-0 nova_compute[192716]:         </nova:properties>
Oct 07 22:01:36 compute-0 nova_compute[192716]:       </nova:image>
Oct 07 22:01:36 compute-0 nova_compute[192716]:       <nova:owner>
Oct 07 22:01:36 compute-0 nova_compute[192716]:         <nova:user uuid="db99335261504aa7b84c7d30ec17d679">tempest-TestExecuteHostMaintenanceStrategy-152687663-project-admin</nova:user>
Oct 07 22:01:36 compute-0 nova_compute[192716]:         <nova:project uuid="3ad27d63f39845acba6b21828806b82a">tempest-TestExecuteHostMaintenanceStrategy-152687663</nova:project>
Oct 07 22:01:36 compute-0 nova_compute[192716]:       </nova:owner>
Oct 07 22:01:36 compute-0 nova_compute[192716]:       <nova:root type="image" uuid="c40cab67-7e52-4762-b275-de0efa24bdf4"/>
Oct 07 22:01:36 compute-0 nova_compute[192716]:       <nova:ports>
Oct 07 22:01:36 compute-0 nova_compute[192716]:         <nova:port uuid="a12818fb-48f8-4828-9483-8e2f0c930534">
Oct 07 22:01:36 compute-0 nova_compute[192716]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Oct 07 22:01:36 compute-0 nova_compute[192716]:         </nova:port>
Oct 07 22:01:36 compute-0 nova_compute[192716]:       </nova:ports>
Oct 07 22:01:36 compute-0 nova_compute[192716]:     </nova:instance>
Oct 07 22:01:36 compute-0 nova_compute[192716]:   </metadata>
Oct 07 22:01:36 compute-0 nova_compute[192716]:   <sysinfo type="smbios">
Oct 07 22:01:36 compute-0 nova_compute[192716]:     <system>
Oct 07 22:01:36 compute-0 nova_compute[192716]:       <entry name="manufacturer">RDO</entry>
Oct 07 22:01:36 compute-0 nova_compute[192716]:       <entry name="product">OpenStack Compute</entry>
Oct 07 22:01:36 compute-0 nova_compute[192716]:       <entry name="version">32.1.0-0.20251007122402.7278e66.el10</entry>
Oct 07 22:01:36 compute-0 nova_compute[192716]:       <entry name="serial">05a008ac-6976-48f0-8fc6-2795863bdf61</entry>
Oct 07 22:01:36 compute-0 nova_compute[192716]:       <entry name="uuid">05a008ac-6976-48f0-8fc6-2795863bdf61</entry>
Oct 07 22:01:36 compute-0 nova_compute[192716]:       <entry name="family">Virtual Machine</entry>
Oct 07 22:01:36 compute-0 nova_compute[192716]:     </system>
Oct 07 22:01:36 compute-0 nova_compute[192716]:   </sysinfo>
Oct 07 22:01:36 compute-0 nova_compute[192716]:   <os>
Oct 07 22:01:36 compute-0 nova_compute[192716]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 07 22:01:36 compute-0 nova_compute[192716]:     <boot dev="hd"/>
Oct 07 22:01:36 compute-0 nova_compute[192716]:     <smbios mode="sysinfo"/>
Oct 07 22:01:36 compute-0 nova_compute[192716]:   </os>
Oct 07 22:01:36 compute-0 nova_compute[192716]:   <features>
Oct 07 22:01:36 compute-0 nova_compute[192716]:     <acpi/>
Oct 07 22:01:36 compute-0 nova_compute[192716]:     <apic/>
Oct 07 22:01:36 compute-0 nova_compute[192716]:     <vmcoreinfo/>
Oct 07 22:01:36 compute-0 nova_compute[192716]:   </features>
Oct 07 22:01:36 compute-0 nova_compute[192716]:   <clock offset="utc">
Oct 07 22:01:36 compute-0 nova_compute[192716]:     <timer name="pit" tickpolicy="delay"/>
Oct 07 22:01:36 compute-0 nova_compute[192716]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 07 22:01:36 compute-0 nova_compute[192716]:     <timer name="hpet" present="no"/>
Oct 07 22:01:36 compute-0 nova_compute[192716]:   </clock>
Oct 07 22:01:36 compute-0 nova_compute[192716]:   <cpu mode="host-model" match="exact">
Oct 07 22:01:36 compute-0 nova_compute[192716]:     <topology sockets="1" cores="1" threads="1"/>
Oct 07 22:01:36 compute-0 nova_compute[192716]:   </cpu>
Oct 07 22:01:36 compute-0 nova_compute[192716]:   <devices>
Oct 07 22:01:36 compute-0 nova_compute[192716]:     <disk type="file" device="disk">
Oct 07 22:01:36 compute-0 nova_compute[192716]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 07 22:01:36 compute-0 nova_compute[192716]:       <source file="/var/lib/nova/instances/05a008ac-6976-48f0-8fc6-2795863bdf61/disk"/>
Oct 07 22:01:36 compute-0 nova_compute[192716]:       <target dev="vda" bus="virtio"/>
Oct 07 22:01:36 compute-0 nova_compute[192716]:     </disk>
Oct 07 22:01:36 compute-0 nova_compute[192716]:     <disk type="file" device="cdrom">
Oct 07 22:01:36 compute-0 nova_compute[192716]:       <driver name="qemu" type="raw" cache="none"/>
Oct 07 22:01:36 compute-0 nova_compute[192716]:       <source file="/var/lib/nova/instances/05a008ac-6976-48f0-8fc6-2795863bdf61/disk.config"/>
Oct 07 22:01:36 compute-0 nova_compute[192716]:       <target dev="sda" bus="sata"/>
Oct 07 22:01:36 compute-0 nova_compute[192716]:     </disk>
Oct 07 22:01:36 compute-0 nova_compute[192716]:     <interface type="ethernet">
Oct 07 22:01:36 compute-0 nova_compute[192716]:       <mac address="fa:16:3e:64:13:25"/>
Oct 07 22:01:36 compute-0 nova_compute[192716]:       <model type="virtio"/>
Oct 07 22:01:36 compute-0 nova_compute[192716]:       <driver name="vhost" rx_queue_size="512"/>
Oct 07 22:01:36 compute-0 nova_compute[192716]:       <mtu size="1442"/>
Oct 07 22:01:36 compute-0 nova_compute[192716]:       <target dev="tapa12818fb-48"/>
Oct 07 22:01:36 compute-0 nova_compute[192716]:     </interface>
Oct 07 22:01:36 compute-0 nova_compute[192716]:     <serial type="pty">
Oct 07 22:01:36 compute-0 nova_compute[192716]:       <log file="/var/lib/nova/instances/05a008ac-6976-48f0-8fc6-2795863bdf61/console.log" append="off"/>
Oct 07 22:01:36 compute-0 nova_compute[192716]:     </serial>
Oct 07 22:01:36 compute-0 nova_compute[192716]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 07 22:01:36 compute-0 nova_compute[192716]:     <video>
Oct 07 22:01:36 compute-0 nova_compute[192716]:       <model type="virtio"/>
Oct 07 22:01:36 compute-0 nova_compute[192716]:     </video>
Oct 07 22:01:36 compute-0 nova_compute[192716]:     <input type="tablet" bus="usb"/>
Oct 07 22:01:36 compute-0 nova_compute[192716]:     <rng model="virtio">
Oct 07 22:01:36 compute-0 nova_compute[192716]:       <backend model="random">/dev/urandom</backend>
Oct 07 22:01:36 compute-0 nova_compute[192716]:     </rng>
Oct 07 22:01:36 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root"/>
Oct 07 22:01:36 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:01:36 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:01:36 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:01:36 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:01:36 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:01:36 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:01:36 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:01:36 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:01:36 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:01:36 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:01:36 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:01:36 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:01:36 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:01:36 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:01:36 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:01:36 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:01:36 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:01:36 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:01:36 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:01:36 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:01:36 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:01:36 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:01:36 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:01:36 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:01:36 compute-0 nova_compute[192716]:     <controller type="usb" index="0"/>
Oct 07 22:01:36 compute-0 nova_compute[192716]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 07 22:01:36 compute-0 nova_compute[192716]:       <stats period="10"/>
Oct 07 22:01:36 compute-0 nova_compute[192716]:     </memballoon>
Oct 07 22:01:36 compute-0 nova_compute[192716]:   </devices>
Oct 07 22:01:36 compute-0 nova_compute[192716]: </domain>
Oct 07 22:01:36 compute-0 nova_compute[192716]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Oct 07 22:01:36 compute-0 nova_compute[192716]: 2025-10-07 22:01:36.378 2 DEBUG nova.compute.manager [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Preparing to wait for external event network-vif-plugged-a12818fb-48f8-4828-9483-8e2f0c930534 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Oct 07 22:01:36 compute-0 nova_compute[192716]: 2025-10-07 22:01:36.378 2 DEBUG oslo_concurrency.lockutils [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Acquiring lock "05a008ac-6976-48f0-8fc6-2795863bdf61-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:01:36 compute-0 nova_compute[192716]: 2025-10-07 22:01:36.379 2 DEBUG oslo_concurrency.lockutils [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Lock "05a008ac-6976-48f0-8fc6-2795863bdf61-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:01:36 compute-0 nova_compute[192716]: 2025-10-07 22:01:36.379 2 DEBUG oslo_concurrency.lockutils [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Lock "05a008ac-6976-48f0-8fc6-2795863bdf61-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:01:36 compute-0 nova_compute[192716]: 2025-10-07 22:01:36.380 2 DEBUG nova.virt.libvirt.vif [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-07T22:01:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-907284284',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-907284284',id=13,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3ad27d63f39845acba6b21828806b82a',ramdisk_id='',reservation_id='r-swr0k1fo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-152687663',owner_user_name='tempest
-TestExecuteHostMaintenanceStrategy-152687663-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T22:01:29Z,user_data=None,user_id='db99335261504aa7b84c7d30ec17d679',uuid=05a008ac-6976-48f0-8fc6-2795863bdf61,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a12818fb-48f8-4828-9483-8e2f0c930534", "address": "fa:16:3e:64:13:25", "network": {"id": "726154fe-bda6-431d-b983-7caa973a9e17", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-468758744-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "df550d234d364e7fb20f9ac88392be8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa12818fb-48", "ovs_interfaceid": "a12818fb-48f8-4828-9483-8e2f0c930534", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 07 22:01:36 compute-0 nova_compute[192716]: 2025-10-07 22:01:36.380 2 DEBUG nova.network.os_vif_util [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Converting VIF {"id": "a12818fb-48f8-4828-9483-8e2f0c930534", "address": "fa:16:3e:64:13:25", "network": {"id": "726154fe-bda6-431d-b983-7caa973a9e17", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-468758744-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "df550d234d364e7fb20f9ac88392be8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa12818fb-48", "ovs_interfaceid": "a12818fb-48f8-4828-9483-8e2f0c930534", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 22:01:36 compute-0 nova_compute[192716]: 2025-10-07 22:01:36.380 2 DEBUG nova.network.os_vif_util [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:64:13:25,bridge_name='br-int',has_traffic_filtering=True,id=a12818fb-48f8-4828-9483-8e2f0c930534,network=Network(726154fe-bda6-431d-b983-7caa973a9e17),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa12818fb-48') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 22:01:36 compute-0 nova_compute[192716]: 2025-10-07 22:01:36.380 2 DEBUG os_vif [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:64:13:25,bridge_name='br-int',has_traffic_filtering=True,id=a12818fb-48f8-4828-9483-8e2f0c930534,network=Network(726154fe-bda6-431d-b983-7caa973a9e17),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa12818fb-48') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 07 22:01:36 compute-0 nova_compute[192716]: 2025-10-07 22:01:36.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:01:36 compute-0 nova_compute[192716]: 2025-10-07 22:01:36.381 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:01:36 compute-0 nova_compute[192716]: 2025-10-07 22:01:36.382 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:01:36 compute-0 nova_compute[192716]: 2025-10-07 22:01:36.382 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:01:36 compute-0 nova_compute[192716]: 2025-10-07 22:01:36.382 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'b27f547c-1ced-5d32-95a1-4c70ae14c3d0', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:01:36 compute-0 nova_compute[192716]: 2025-10-07 22:01:36.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:01:36 compute-0 nova_compute[192716]: 2025-10-07 22:01:36.385 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:01:36 compute-0 nova_compute[192716]: 2025-10-07 22:01:36.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:01:36 compute-0 nova_compute[192716]: 2025-10-07 22:01:36.388 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa12818fb-48, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:01:36 compute-0 nova_compute[192716]: 2025-10-07 22:01:36.388 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapa12818fb-48, col_values=(('qos', UUID('b54e8b96-3786-410d-a31f-9ac742ce35db')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:01:36 compute-0 nova_compute[192716]: 2025-10-07 22:01:36.388 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapa12818fb-48, col_values=(('external_ids', {'iface-id': 'a12818fb-48f8-4828-9483-8e2f0c930534', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:64:13:25', 'vm-uuid': '05a008ac-6976-48f0-8fc6-2795863bdf61'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:01:36 compute-0 nova_compute[192716]: 2025-10-07 22:01:36.390 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:01:36 compute-0 NetworkManager[51722]: <info>  [1759874496.3910] manager: (tapa12818fb-48): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/43)
Oct 07 22:01:36 compute-0 nova_compute[192716]: 2025-10-07 22:01:36.392 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 07 22:01:36 compute-0 nova_compute[192716]: 2025-10-07 22:01:36.398 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:01:36 compute-0 nova_compute[192716]: 2025-10-07 22:01:36.399 2 INFO os_vif [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:64:13:25,bridge_name='br-int',has_traffic_filtering=True,id=a12818fb-48f8-4828-9483-8e2f0c930534,network=Network(726154fe-bda6-431d-b983-7caa973a9e17),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa12818fb-48')
Oct 07 22:01:36 compute-0 podman[219799]: 2025-10-07 22:01:36.84620477 +0000 UTC m=+0.079226834 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, 
org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true)
Oct 07 22:01:36 compute-0 nova_compute[192716]: 2025-10-07 22:01:36.855 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 22:01:37 compute-0 nova_compute[192716]: 2025-10-07 22:01:37.367 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 07 22:01:37 compute-0 nova_compute[192716]: 2025-10-07 22:01:37.367 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.127s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:01:37 compute-0 nova_compute[192716]: 2025-10-07 22:01:37.863 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:01:37 compute-0 nova_compute[192716]: 2025-10-07 22:01:37.940 2 DEBUG nova.virt.libvirt.driver [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 07 22:01:37 compute-0 nova_compute[192716]: 2025-10-07 22:01:37.941 2 DEBUG nova.virt.libvirt.driver [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 07 22:01:37 compute-0 nova_compute[192716]: 2025-10-07 22:01:37.941 2 DEBUG nova.virt.libvirt.driver [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] No VIF found with MAC fa:16:3e:64:13:25, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Oct 07 22:01:37 compute-0 nova_compute[192716]: 2025-10-07 22:01:37.941 2 INFO nova.virt.libvirt.driver [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Using config drive
Oct 07 22:01:38 compute-0 nova_compute[192716]: 2025-10-07 22:01:38.452 2 WARNING neutronclient.v2_0.client [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:01:38 compute-0 nova_compute[192716]: 2025-10-07 22:01:38.887 2 INFO nova.virt.libvirt.driver [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Creating config drive at /var/lib/nova/instances/05a008ac-6976-48f0-8fc6-2795863bdf61/disk.config
Oct 07 22:01:38 compute-0 nova_compute[192716]: 2025-10-07 22:01:38.892 2 DEBUG oslo_concurrency.processutils [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/05a008ac-6976-48f0-8fc6-2795863bdf61/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251007122402.7278e66.el10 -quiet -J -r -V config-2 /tmp/tmpziyj9vdm execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:01:38 compute-0 nova_compute[192716]: 2025-10-07 22:01:38.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:01:38 compute-0 nova_compute[192716]: 2025-10-07 22:01:38.991 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Oct 07 22:01:39 compute-0 nova_compute[192716]: 2025-10-07 22:01:39.023 2 DEBUG oslo_concurrency.processutils [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/05a008ac-6976-48f0-8fc6-2795863bdf61/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251007122402.7278e66.el10 -quiet -J -r -V config-2 /tmp/tmpziyj9vdm" returned: 0 in 0.131s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:01:39 compute-0 kernel: tapa12818fb-48: entered promiscuous mode
Oct 07 22:01:39 compute-0 NetworkManager[51722]: <info>  [1759874499.1099] manager: (tapa12818fb-48): new Tun device (/org/freedesktop/NetworkManager/Devices/44)
Oct 07 22:01:39 compute-0 nova_compute[192716]: 2025-10-07 22:01:39.148 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:01:39 compute-0 ovn_controller[94904]: 2025-10-07T22:01:39Z|00107|binding|INFO|Claiming lport a12818fb-48f8-4828-9483-8e2f0c930534 for this chassis.
Oct 07 22:01:39 compute-0 ovn_controller[94904]: 2025-10-07T22:01:39Z|00108|binding|INFO|a12818fb-48f8-4828-9483-8e2f0c930534: Claiming fa:16:3e:64:13:25 10.100.0.14
Oct 07 22:01:39 compute-0 nova_compute[192716]: 2025-10-07 22:01:39.152 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:01:39 compute-0 systemd-udevd[219835]: Network interface NamePolicy= disabled on kernel command line.
Oct 07 22:01:39 compute-0 systemd-machined[152719]: New machine qemu-8-instance-0000000d.
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:01:39.175 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:64:13:25 10.100.0.14'], port_security=['fa:16:3e:64:13:25 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '05a008ac-6976-48f0-8fc6-2795863bdf61', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-726154fe-bda6-431d-b983-7caa973a9e17', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ad27d63f39845acba6b21828806b82a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7085a98e-3cea-46c4-a04e-730e3c566bcd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4057f821-2b26-4a21-8644-5757b0f352fc, chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], logical_port=a12818fb-48f8-4828-9483-8e2f0c930534) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:01:39.176 103791 INFO neutron.agent.ovn.metadata.agent [-] Port a12818fb-48f8-4828-9483-8e2f0c930534 in datapath 726154fe-bda6-431d-b983-7caa973a9e17 bound to our chassis
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:01:39.177 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 726154fe-bda6-431d-b983-7caa973a9e17
Oct 07 22:01:39 compute-0 NetworkManager[51722]: <info>  [1759874499.1837] device (tapa12818fb-48): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 07 22:01:39 compute-0 NetworkManager[51722]: <info>  [1759874499.1847] device (tapa12818fb-48): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:01:39.190 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[f0a4dee0-72a1-4161-877a-d64189fe4922]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:01:39.191 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap726154fe-b1 in ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:01:39.192 214116 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap726154fe-b0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:01:39.193 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[37f4ca4f-5a3d-43f8-b7fa-eeea10df5d85]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:01:39.194 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[c4ec5ac2-05e5-4ef5-b905-11817542dc81]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:01:39.204 103905 DEBUG oslo.privsep.daemon [-] privsep: reply[b2700795-b8a1-4926-b3db-a2d63e1808d2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:01:39 compute-0 systemd[1]: Started Virtual Machine qemu-8-instance-0000000d.
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:01:39.233 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[6b0e8d13-3fcc-411f-a64b-861a23033a51]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:01:39 compute-0 ovn_controller[94904]: 2025-10-07T22:01:39Z|00109|binding|INFO|Setting lport a12818fb-48f8-4828-9483-8e2f0c930534 ovn-installed in OVS
Oct 07 22:01:39 compute-0 ovn_controller[94904]: 2025-10-07T22:01:39Z|00110|binding|INFO|Setting lport a12818fb-48f8-4828-9483-8e2f0c930534 up in Southbound
Oct 07 22:01:39 compute-0 nova_compute[192716]: 2025-10-07 22:01:39.236 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:01:39.266 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[3df42c6c-c883-4e31-8950-4eb4e35488b1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:01:39 compute-0 NetworkManager[51722]: <info>  [1759874499.2721] manager: (tap726154fe-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/45)
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:01:39.271 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[554930a2-e2b7-40e8-897a-388e6454ada7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:01:39.305 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[19d74a56-56a4-48c2-bd5f-07ce78b95115]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:01:39.308 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[e8898761-e64e-4872-a741-387777926700]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:01:39 compute-0 NetworkManager[51722]: <info>  [1759874499.3359] device (tap726154fe-b0): carrier: link connected
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:01:39.337 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[8cbd4b33-2990-482a-b2a7-24aecf965fbe]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:01:39.360 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[bbe169b1-5db5-4587-a062-6623beda8510]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap726154fe-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:af:b2:2e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 31], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 423291, 'reachable_time': 41705, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219870, 'error': None, 'target': 'ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:01:39.381 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[c407510f-a88a-4f30-a537-6cb8f304f272]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feaf:b22e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 423291, 'tstamp': 423291}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219872, 'error': None, 'target': 'ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:01:39.401 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[15634e37-a503-4e75-be30-95685427bde6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap726154fe-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:af:b2:2e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 31], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 423291, 'reachable_time': 41705, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219873, 'error': None, 'target': 'ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:01:39.435 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[96e55948-dc4f-497c-9d57-23b1e59bb5d7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:01:39.496 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[6328918f-3de0-4f51-9443-48ca8329bca7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:01:39.497 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap726154fe-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:01:39.497 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:01:39.498 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap726154fe-b0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:01:39 compute-0 NetworkManager[51722]: <info>  [1759874499.5005] manager: (tap726154fe-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/46)
Oct 07 22:01:39 compute-0 nova_compute[192716]: 2025-10-07 22:01:39.500 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:01:39 compute-0 kernel: tap726154fe-b0: entered promiscuous mode
Oct 07 22:01:39 compute-0 nova_compute[192716]: 2025-10-07 22:01:39.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:01:39.504 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap726154fe-b0, col_values=(('external_ids', {'iface-id': 'b6dfddd4-019f-4508-ab9b-37759605366f'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:01:39 compute-0 ovn_controller[94904]: 2025-10-07T22:01:39Z|00111|binding|INFO|Releasing lport b6dfddd4-019f-4508-ab9b-37759605366f from this chassis (sb_readonly=0)
Oct 07 22:01:39 compute-0 nova_compute[192716]: 2025-10-07 22:01:39.506 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:01:39 compute-0 nova_compute[192716]: 2025-10-07 22:01:39.521 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:01:39.523 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[0e539803-ff1f-477f-b9db-b6ec1a364cfe]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:01:39.524 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/726154fe-bda6-431d-b983-7caa973a9e17.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/726154fe-bda6-431d-b983-7caa973a9e17.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:01:39.524 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/726154fe-bda6-431d-b983-7caa973a9e17.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/726154fe-bda6-431d-b983-7caa973a9e17.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:01:39.524 103791 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 726154fe-bda6-431d-b983-7caa973a9e17 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:01:39.524 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/726154fe-bda6-431d-b983-7caa973a9e17.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/726154fe-bda6-431d-b983-7caa973a9e17.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:01:39.525 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[9ec4a507-ca97-4b21-a2aa-de60f751420a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:01:39.525 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/726154fe-bda6-431d-b983-7caa973a9e17.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/726154fe-bda6-431d-b983-7caa973a9e17.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:01:39.525 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[04d56be8-5ac0-42a0-a565-243bfad482bb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:01:39.526 103791 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]: global
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]:     log         /dev/log local0 debug
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]:     log-tag     haproxy-metadata-proxy-726154fe-bda6-431d-b983-7caa973a9e17
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]:     user        root
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]:     group       root
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]:     maxconn     1024
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]:     pidfile     /var/lib/neutron/external/pids/726154fe-bda6-431d-b983-7caa973a9e17.pid.haproxy
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]:     daemon
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]: 
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]: defaults
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]:     log global
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]:     mode http
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]:     option httplog
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]:     option dontlognull
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]:     option http-server-close
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]:     option forwardfor
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]:     retries                 3
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]:     timeout http-request    30s
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]:     timeout connect         30s
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]:     timeout client          32s
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]:     timeout server          32s
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]:     timeout http-keep-alive 30s
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]: 
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]: listen listener
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]:     bind 169.254.169.254:80
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]:     
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]:     server metadata /var/lib/neutron/metadata_proxy
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]: 
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]:     http-request add-header X-OVN-Network-ID 726154fe-bda6-431d-b983-7caa973a9e17
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Oct 07 22:01:39 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:01:39.526 103791 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17', 'env', 'PROCESS_TAG=haproxy-726154fe-bda6-431d-b983-7caa973a9e17', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/726154fe-bda6-431d-b983-7caa973a9e17.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Oct 07 22:01:39 compute-0 nova_compute[192716]: 2025-10-07 22:01:39.655 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:01:40 compute-0 podman[219912]: 2025-10-07 22:01:39.915999632 +0000 UTC m=+0.023822868 image pull 24d4277b41bbd1d97b6f360ea068040fe96182680512bacad34d1f578f4798a9 38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Oct 07 22:01:40 compute-0 podman[219912]: 2025-10-07 22:01:40.092044626 +0000 UTC m=+0.199867822 container create 5d54d4672dbef2af578a659a60cf159e216d33924363c051feb22fc347742baf (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17, org.label-schema.build-date=20251007, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0)
Oct 07 22:01:40 compute-0 systemd[1]: Started libpod-conmon-5d54d4672dbef2af578a659a60cf159e216d33924363c051feb22fc347742baf.scope.
Oct 07 22:01:40 compute-0 systemd[1]: Started libcrun container.
Oct 07 22:01:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6df0ed26343527536356c1ce148cf585a8e4f7d49f6a214bfdcdd8d8a79aa858/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 07 22:01:40 compute-0 nova_compute[192716]: 2025-10-07 22:01:40.247 2 DEBUG nova.compute.manager [req-521f62f4-09f9-4cbb-a600-90323d3866df req-32a0e301-49bb-4e59-9dda-a855b7efbc39 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Received event network-vif-plugged-a12818fb-48f8-4828-9483-8e2f0c930534 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:01:40 compute-0 nova_compute[192716]: 2025-10-07 22:01:40.249 2 DEBUG oslo_concurrency.lockutils [req-521f62f4-09f9-4cbb-a600-90323d3866df req-32a0e301-49bb-4e59-9dda-a855b7efbc39 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "05a008ac-6976-48f0-8fc6-2795863bdf61-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:01:40 compute-0 nova_compute[192716]: 2025-10-07 22:01:40.250 2 DEBUG oslo_concurrency.lockutils [req-521f62f4-09f9-4cbb-a600-90323d3866df req-32a0e301-49bb-4e59-9dda-a855b7efbc39 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "05a008ac-6976-48f0-8fc6-2795863bdf61-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:01:40 compute-0 nova_compute[192716]: 2025-10-07 22:01:40.250 2 DEBUG oslo_concurrency.lockutils [req-521f62f4-09f9-4cbb-a600-90323d3866df req-32a0e301-49bb-4e59-9dda-a855b7efbc39 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "05a008ac-6976-48f0-8fc6-2795863bdf61-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:01:40 compute-0 nova_compute[192716]: 2025-10-07 22:01:40.251 2 DEBUG nova.compute.manager [req-521f62f4-09f9-4cbb-a600-90323d3866df req-32a0e301-49bb-4e59-9dda-a855b7efbc39 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Processing event network-vif-plugged-a12818fb-48f8-4828-9483-8e2f0c930534 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Oct 07 22:01:40 compute-0 nova_compute[192716]: 2025-10-07 22:01:40.252 2 DEBUG nova.compute.manager [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Oct 07 22:01:40 compute-0 nova_compute[192716]: 2025-10-07 22:01:40.257 2 DEBUG nova.virt.libvirt.driver [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Oct 07 22:01:40 compute-0 nova_compute[192716]: 2025-10-07 22:01:40.264 2 INFO nova.virt.libvirt.driver [-] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Instance spawned successfully.
Oct 07 22:01:40 compute-0 nova_compute[192716]: 2025-10-07 22:01:40.265 2 DEBUG nova.virt.libvirt.driver [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Oct 07 22:01:40 compute-0 podman[219912]: 2025-10-07 22:01:40.266754631 +0000 UTC m=+0.374577877 container init 5d54d4672dbef2af578a659a60cf159e216d33924363c051feb22fc347742baf (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct 07 22:01:40 compute-0 podman[219926]: 2025-10-07 22:01:40.271131877 +0000 UTC m=+0.135048373 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, version=9.6, architecture=x86_64, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible)
Oct 07 22:01:40 compute-0 podman[219912]: 2025-10-07 22:01:40.277199652 +0000 UTC m=+0.385022888 container start 5d54d4672dbef2af578a659a60cf159e216d33924363c051feb22fc347742baf (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team)
Oct 07 22:01:40 compute-0 neutron-haproxy-ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17[219941]: [NOTICE]   (219954) : New worker (219956) forked
Oct 07 22:01:40 compute-0 neutron-haproxy-ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17[219941]: [NOTICE]   (219954) : Loading success.
Oct 07 22:01:40 compute-0 nova_compute[192716]: 2025-10-07 22:01:40.821 2 DEBUG nova.virt.libvirt.driver [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 22:01:40 compute-0 nova_compute[192716]: 2025-10-07 22:01:40.822 2 DEBUG nova.virt.libvirt.driver [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 22:01:40 compute-0 nova_compute[192716]: 2025-10-07 22:01:40.823 2 DEBUG nova.virt.libvirt.driver [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 22:01:40 compute-0 nova_compute[192716]: 2025-10-07 22:01:40.824 2 DEBUG nova.virt.libvirt.driver [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 22:01:40 compute-0 nova_compute[192716]: 2025-10-07 22:01:40.825 2 DEBUG nova.virt.libvirt.driver [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 22:01:40 compute-0 nova_compute[192716]: 2025-10-07 22:01:40.826 2 DEBUG nova.virt.libvirt.driver [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 22:01:41 compute-0 nova_compute[192716]: 2025-10-07 22:01:41.344 2 INFO nova.compute.manager [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Took 11.35 seconds to spawn the instance on the hypervisor.
Oct 07 22:01:41 compute-0 nova_compute[192716]: 2025-10-07 22:01:41.346 2 DEBUG nova.compute.manager [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 07 22:01:41 compute-0 nova_compute[192716]: 2025-10-07 22:01:41.390 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:01:41 compute-0 nova_compute[192716]: 2025-10-07 22:01:41.887 2 INFO nova.compute.manager [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Took 16.67 seconds to build instance.
Oct 07 22:01:42 compute-0 nova_compute[192716]: 2025-10-07 22:01:42.336 2 DEBUG nova.compute.manager [req-74326ef1-ae30-48a2-a8d7-e7884ee3b11b req-d3676b6d-4e15-4258-80dd-305c0bac501a 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Received event network-vif-plugged-a12818fb-48f8-4828-9483-8e2f0c930534 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:01:42 compute-0 nova_compute[192716]: 2025-10-07 22:01:42.337 2 DEBUG oslo_concurrency.lockutils [req-74326ef1-ae30-48a2-a8d7-e7884ee3b11b req-d3676b6d-4e15-4258-80dd-305c0bac501a 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "05a008ac-6976-48f0-8fc6-2795863bdf61-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:01:42 compute-0 nova_compute[192716]: 2025-10-07 22:01:42.338 2 DEBUG oslo_concurrency.lockutils [req-74326ef1-ae30-48a2-a8d7-e7884ee3b11b req-d3676b6d-4e15-4258-80dd-305c0bac501a 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "05a008ac-6976-48f0-8fc6-2795863bdf61-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:01:42 compute-0 nova_compute[192716]: 2025-10-07 22:01:42.338 2 DEBUG oslo_concurrency.lockutils [req-74326ef1-ae30-48a2-a8d7-e7884ee3b11b req-d3676b6d-4e15-4258-80dd-305c0bac501a 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "05a008ac-6976-48f0-8fc6-2795863bdf61-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:01:42 compute-0 nova_compute[192716]: 2025-10-07 22:01:42.339 2 DEBUG nova.compute.manager [req-74326ef1-ae30-48a2-a8d7-e7884ee3b11b req-d3676b6d-4e15-4258-80dd-305c0bac501a 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] No waiting events found dispatching network-vif-plugged-a12818fb-48f8-4828-9483-8e2f0c930534 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 22:01:42 compute-0 nova_compute[192716]: 2025-10-07 22:01:42.339 2 WARNING nova.compute.manager [req-74326ef1-ae30-48a2-a8d7-e7884ee3b11b req-d3676b6d-4e15-4258-80dd-305c0bac501a 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Received unexpected event network-vif-plugged-a12818fb-48f8-4828-9483-8e2f0c930534 for instance with vm_state active and task_state None.
Oct 07 22:01:42 compute-0 nova_compute[192716]: 2025-10-07 22:01:42.392 2 DEBUG oslo_concurrency.lockutils [None req-a8b4d20f-f0ce-471c-8f04-7118a01e1879 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Lock "05a008ac-6976-48f0-8fc6-2795863bdf61" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.200s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:01:44 compute-0 nova_compute[192716]: 2025-10-07 22:01:44.690 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:01:46 compute-0 nova_compute[192716]: 2025-10-07 22:01:46.392 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:01:49 compute-0 nova_compute[192716]: 2025-10-07 22:01:49.691 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:01:51 compute-0 nova_compute[192716]: 2025-10-07 22:01:51.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:01:51 compute-0 ovn_controller[94904]: 2025-10-07T22:01:51Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:64:13:25 10.100.0.14
Oct 07 22:01:51 compute-0 ovn_controller[94904]: 2025-10-07T22:01:51Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:64:13:25 10.100.0.14
Oct 07 22:01:54 compute-0 nova_compute[192716]: 2025-10-07 22:01:54.693 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:01:54 compute-0 podman[219975]: 2025-10-07 22:01:54.822101216 +0000 UTC m=+0.065129335 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251007, tcib_build_tag=watcher_latest, tcib_managed=true)
Oct 07 22:01:54 compute-0 podman[219976]: 2025-10-07 22:01:54.835901996 +0000 UTC m=+0.065988390 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=watcher_latest, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=multipathd)
Oct 07 22:01:56 compute-0 nova_compute[192716]: 2025-10-07 22:01:56.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:01:58 compute-0 nova_compute[192716]: 2025-10-07 22:01:58.494 2 DEBUG nova.virt.libvirt.driver [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: fd9b0d7e-e882-4574-9e62-a1d142bb6a16] Creating tmpfile /var/lib/nova/instances/tmpus27fwmk to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Oct 07 22:01:58 compute-0 nova_compute[192716]: 2025-10-07 22:01:58.495 2 WARNING neutronclient.v2_0.client [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:01:58 compute-0 nova_compute[192716]: 2025-10-07 22:01:58.511 2 DEBUG nova.compute.manager [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpus27fwmk',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Oct 07 22:01:58 compute-0 podman[220014]: 2025-10-07 22:01:58.634264162 +0000 UTC m=+0.092816386 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 07 22:01:59 compute-0 nova_compute[192716]: 2025-10-07 22:01:59.694 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:01:59 compute-0 podman[203153]: time="2025-10-07T22:01:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:01:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:01:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20751 "" "Go-http-client/1.1"
Oct 07 22:01:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:01:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3476 "" "Go-http-client/1.1"
Oct 07 22:02:00 compute-0 nova_compute[192716]: 2025-10-07 22:02:00.562 2 WARNING neutronclient.v2_0.client [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:02:01 compute-0 openstack_network_exporter[205305]: ERROR   22:02:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:02:01 compute-0 openstack_network_exporter[205305]: ERROR   22:02:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:02:01 compute-0 openstack_network_exporter[205305]: ERROR   22:02:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:02:01 compute-0 openstack_network_exporter[205305]: ERROR   22:02:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:02:01 compute-0 openstack_network_exporter[205305]: ERROR   22:02:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:02:01 compute-0 nova_compute[192716]: 2025-10-07 22:02:01.426 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:04 compute-0 nova_compute[192716]: 2025-10-07 22:02:04.698 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:05 compute-0 nova_compute[192716]: 2025-10-07 22:02:05.808 2 DEBUG nova.compute.manager [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpus27fwmk',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='fd9b0d7e-e882-4574-9e62-a1d142bb6a16',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Oct 07 22:02:05 compute-0 podman[220039]: 2025-10-07 22:02:05.902563733 +0000 UTC m=+0.134268866 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.4, org.label-schema.build-date=20251007)
Oct 07 22:02:06 compute-0 nova_compute[192716]: 2025-10-07 22:02:06.428 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:06 compute-0 nova_compute[192716]: 2025-10-07 22:02:06.826 2 DEBUG oslo_concurrency.lockutils [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "refresh_cache-fd9b0d7e-e882-4574-9e62-a1d142bb6a16" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 07 22:02:06 compute-0 nova_compute[192716]: 2025-10-07 22:02:06.827 2 DEBUG oslo_concurrency.lockutils [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquired lock "refresh_cache-fd9b0d7e-e882-4574-9e62-a1d142bb6a16" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 07 22:02:06 compute-0 nova_compute[192716]: 2025-10-07 22:02:06.828 2 DEBUG nova.network.neutron [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: fd9b0d7e-e882-4574-9e62-a1d142bb6a16] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 07 22:02:07 compute-0 nova_compute[192716]: 2025-10-07 22:02:07.338 2 WARNING neutronclient.v2_0.client [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:02:07 compute-0 podman[220066]: 2025-10-07 22:02:07.844682227 +0000 UTC m=+0.078537283 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Oct 07 22:02:07 compute-0 nova_compute[192716]: 2025-10-07 22:02:07.998 2 WARNING neutronclient.v2_0.client [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:02:08 compute-0 nova_compute[192716]: 2025-10-07 22:02:08.193 2 DEBUG nova.network.neutron [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: fd9b0d7e-e882-4574-9e62-a1d142bb6a16] Updating instance_info_cache with network_info: [{"id": "c4e362fa-a4d3-43d8-80c2-29fa599f7f3f", "address": "fa:16:3e:4d:8e:a6", "network": {"id": "726154fe-bda6-431d-b983-7caa973a9e17", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-468758744-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "df550d234d364e7fb20f9ac88392be8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4e362fa-a4", "ovs_interfaceid": "c4e362fa-a4d3-43d8-80c2-29fa599f7f3f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:02:08 compute-0 nova_compute[192716]: 2025-10-07 22:02:08.701 2 DEBUG oslo_concurrency.lockutils [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Releasing lock "refresh_cache-fd9b0d7e-e882-4574-9e62-a1d142bb6a16" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 07 22:02:08 compute-0 nova_compute[192716]: 2025-10-07 22:02:08.717 2 DEBUG nova.virt.libvirt.driver [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: fd9b0d7e-e882-4574-9e62-a1d142bb6a16] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpus27fwmk',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='fd9b0d7e-e882-4574-9e62-a1d142bb6a16',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Oct 07 22:02:08 compute-0 nova_compute[192716]: 2025-10-07 22:02:08.718 2 DEBUG nova.virt.libvirt.driver [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: fd9b0d7e-e882-4574-9e62-a1d142bb6a16] Creating instance directory: /var/lib/nova/instances/fd9b0d7e-e882-4574-9e62-a1d142bb6a16 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Oct 07 22:02:08 compute-0 nova_compute[192716]: 2025-10-07 22:02:08.718 2 DEBUG nova.virt.libvirt.driver [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: fd9b0d7e-e882-4574-9e62-a1d142bb6a16] Creating disk.info with the contents: {'/var/lib/nova/instances/fd9b0d7e-e882-4574-9e62-a1d142bb6a16/disk': 'qcow2', '/var/lib/nova/instances/fd9b0d7e-e882-4574-9e62-a1d142bb6a16/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Oct 07 22:02:08 compute-0 nova_compute[192716]: 2025-10-07 22:02:08.719 2 DEBUG nova.virt.libvirt.driver [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: fd9b0d7e-e882-4574-9e62-a1d142bb6a16] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Oct 07 22:02:08 compute-0 nova_compute[192716]: 2025-10-07 22:02:08.720 2 DEBUG nova.objects.instance [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lazy-loading 'trusted_certs' on Instance uuid fd9b0d7e-e882-4574-9e62-a1d142bb6a16 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 22:02:09 compute-0 nova_compute[192716]: 2025-10-07 22:02:09.228 2 DEBUG oslo_utils.imageutils.format_inspector [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:02:09 compute-0 nova_compute[192716]: 2025-10-07 22:02:09.232 2 DEBUG oslo_utils.imageutils.format_inspector [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:02:09 compute-0 nova_compute[192716]: 2025-10-07 22:02:09.233 2 DEBUG oslo_concurrency.processutils [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:02:09 compute-0 nova_compute[192716]: 2025-10-07 22:02:09.326 2 DEBUG oslo_concurrency.processutils [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:02:09 compute-0 nova_compute[192716]: 2025-10-07 22:02:09.327 2 DEBUG oslo_concurrency.lockutils [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:02:09 compute-0 nova_compute[192716]: 2025-10-07 22:02:09.328 2 DEBUG oslo_concurrency.lockutils [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:02:09 compute-0 nova_compute[192716]: 2025-10-07 22:02:09.328 2 DEBUG oslo_utils.imageutils.format_inspector [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:02:09 compute-0 nova_compute[192716]: 2025-10-07 22:02:09.334 2 DEBUG oslo_utils.imageutils.format_inspector [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:02:09 compute-0 nova_compute[192716]: 2025-10-07 22:02:09.335 2 DEBUG oslo_concurrency.processutils [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:02:09 compute-0 nova_compute[192716]: 2025-10-07 22:02:09.397 2 DEBUG oslo_concurrency.processutils [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:02:09 compute-0 nova_compute[192716]: 2025-10-07 22:02:09.398 2 DEBUG oslo_concurrency.processutils [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71,backing_fmt=raw /var/lib/nova/instances/fd9b0d7e-e882-4574-9e62-a1d142bb6a16/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:02:09 compute-0 nova_compute[192716]: 2025-10-07 22:02:09.429 2 DEBUG oslo_concurrency.processutils [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71,backing_fmt=raw /var/lib/nova/instances/fd9b0d7e-e882-4574-9e62-a1d142bb6a16/disk 1073741824" returned: 0 in 0.031s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:02:09 compute-0 nova_compute[192716]: 2025-10-07 22:02:09.430 2 DEBUG oslo_concurrency.lockutils [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.102s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:02:09 compute-0 nova_compute[192716]: 2025-10-07 22:02:09.430 2 DEBUG oslo_concurrency.processutils [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:02:09 compute-0 nova_compute[192716]: 2025-10-07 22:02:09.495 2 DEBUG oslo_concurrency.processutils [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:02:09 compute-0 nova_compute[192716]: 2025-10-07 22:02:09.496 2 DEBUG nova.virt.disk.api [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Checking if we can resize image /var/lib/nova/instances/fd9b0d7e-e882-4574-9e62-a1d142bb6a16/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 07 22:02:09 compute-0 nova_compute[192716]: 2025-10-07 22:02:09.496 2 DEBUG oslo_concurrency.processutils [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fd9b0d7e-e882-4574-9e62-a1d142bb6a16/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:02:09 compute-0 ovn_controller[94904]: 2025-10-07T22:02:09Z|00112|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Oct 07 22:02:09 compute-0 nova_compute[192716]: 2025-10-07 22:02:09.577 2 DEBUG oslo_concurrency.processutils [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fd9b0d7e-e882-4574-9e62-a1d142bb6a16/disk --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:02:09 compute-0 nova_compute[192716]: 2025-10-07 22:02:09.577 2 DEBUG nova.virt.disk.api [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Cannot resize image /var/lib/nova/instances/fd9b0d7e-e882-4574-9e62-a1d142bb6a16/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 07 22:02:09 compute-0 nova_compute[192716]: 2025-10-07 22:02:09.578 2 DEBUG nova.objects.instance [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lazy-loading 'migration_context' on Instance uuid fd9b0d7e-e882-4574-9e62-a1d142bb6a16 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 22:02:09 compute-0 nova_compute[192716]: 2025-10-07 22:02:09.700 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:10 compute-0 nova_compute[192716]: 2025-10-07 22:02:10.084 2 DEBUG nova.objects.base [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Object Instance<fd9b0d7e-e882-4574-9e62-a1d142bb6a16> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Oct 07 22:02:10 compute-0 nova_compute[192716]: 2025-10-07 22:02:10.085 2 DEBUG oslo_concurrency.processutils [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/fd9b0d7e-e882-4574-9e62-a1d142bb6a16/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:02:10 compute-0 nova_compute[192716]: 2025-10-07 22:02:10.124 2 DEBUG oslo_concurrency.processutils [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/fd9b0d7e-e882-4574-9e62-a1d142bb6a16/disk.config 497664" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:02:10 compute-0 nova_compute[192716]: 2025-10-07 22:02:10.125 2 DEBUG nova.virt.libvirt.driver [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: fd9b0d7e-e882-4574-9e62-a1d142bb6a16] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Oct 07 22:02:10 compute-0 nova_compute[192716]: 2025-10-07 22:02:10.126 2 DEBUG nova.virt.libvirt.vif [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-10-07T22:01:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1032540532',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1032540532',id=12,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-07T22:01:19Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3ad27d63f39845acba6b21828806b82a',ramdisk_id='',reservation_id='r-7m50so7e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mode
l='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-152687663',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-152687663-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T22:01:19Z,user_data=None,user_id='db99335261504aa7b84c7d30ec17d679',uuid=fd9b0d7e-e882-4574-9e62-a1d142bb6a16,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c4e362fa-a4d3-43d8-80c2-29fa599f7f3f", "address": "fa:16:3e:4d:8e:a6", "network": {"id": "726154fe-bda6-431d-b983-7caa973a9e17", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-468758744-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "df550d234d364e7fb20f9ac88392be8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapc4e362fa-a4", "ovs_interfaceid": "c4e362fa-a4d3-43d8-80c2-29fa599f7f3f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 07 22:02:10 compute-0 nova_compute[192716]: 2025-10-07 22:02:10.127 2 DEBUG nova.network.os_vif_util [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Converting VIF {"id": "c4e362fa-a4d3-43d8-80c2-29fa599f7f3f", "address": "fa:16:3e:4d:8e:a6", "network": {"id": "726154fe-bda6-431d-b983-7caa973a9e17", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-468758744-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "df550d234d364e7fb20f9ac88392be8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapc4e362fa-a4", "ovs_interfaceid": "c4e362fa-a4d3-43d8-80c2-29fa599f7f3f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 22:02:10 compute-0 nova_compute[192716]: 2025-10-07 22:02:10.128 2 DEBUG nova.network.os_vif_util [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4d:8e:a6,bridge_name='br-int',has_traffic_filtering=True,id=c4e362fa-a4d3-43d8-80c2-29fa599f7f3f,network=Network(726154fe-bda6-431d-b983-7caa973a9e17),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc4e362fa-a4') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 22:02:10 compute-0 nova_compute[192716]: 2025-10-07 22:02:10.128 2 DEBUG os_vif [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4d:8e:a6,bridge_name='br-int',has_traffic_filtering=True,id=c4e362fa-a4d3-43d8-80c2-29fa599f7f3f,network=Network(726154fe-bda6-431d-b983-7caa973a9e17),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc4e362fa-a4') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 07 22:02:10 compute-0 nova_compute[192716]: 2025-10-07 22:02:10.129 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:10 compute-0 nova_compute[192716]: 2025-10-07 22:02:10.130 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:02:10 compute-0 nova_compute[192716]: 2025-10-07 22:02:10.131 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:02:10 compute-0 nova_compute[192716]: 2025-10-07 22:02:10.132 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:10 compute-0 nova_compute[192716]: 2025-10-07 22:02:10.132 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '13101ae0-c590-504e-8b13-26350af0ea57', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:02:10 compute-0 nova_compute[192716]: 2025-10-07 22:02:10.134 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:10 compute-0 nova_compute[192716]: 2025-10-07 22:02:10.138 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 07 22:02:10 compute-0 nova_compute[192716]: 2025-10-07 22:02:10.141 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:10 compute-0 nova_compute[192716]: 2025-10-07 22:02:10.141 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc4e362fa-a4, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:02:10 compute-0 nova_compute[192716]: 2025-10-07 22:02:10.142 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapc4e362fa-a4, col_values=(('qos', UUID('b089b8e9-fae4-43a2-be94-56921a953ac5')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:02:10 compute-0 nova_compute[192716]: 2025-10-07 22:02:10.142 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapc4e362fa-a4, col_values=(('external_ids', {'iface-id': 'c4e362fa-a4d3-43d8-80c2-29fa599f7f3f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4d:8e:a6', 'vm-uuid': 'fd9b0d7e-e882-4574-9e62-a1d142bb6a16'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:02:10 compute-0 nova_compute[192716]: 2025-10-07 22:02:10.143 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:10 compute-0 NetworkManager[51722]: <info>  [1759874530.1454] manager: (tapc4e362fa-a4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/47)
Oct 07 22:02:10 compute-0 nova_compute[192716]: 2025-10-07 22:02:10.146 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 07 22:02:10 compute-0 nova_compute[192716]: 2025-10-07 22:02:10.150 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:10 compute-0 nova_compute[192716]: 2025-10-07 22:02:10.151 2 INFO os_vif [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4d:8e:a6,bridge_name='br-int',has_traffic_filtering=True,id=c4e362fa-a4d3-43d8-80c2-29fa599f7f3f,network=Network(726154fe-bda6-431d-b983-7caa973a9e17),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc4e362fa-a4')
Oct 07 22:02:10 compute-0 nova_compute[192716]: 2025-10-07 22:02:10.151 2 DEBUG nova.virt.libvirt.driver [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Oct 07 22:02:10 compute-0 nova_compute[192716]: 2025-10-07 22:02:10.152 2 DEBUG nova.compute.manager [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpus27fwmk',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='fd9b0d7e-e882-4574-9e62-a1d142bb6a16',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Oct 07 22:02:10 compute-0 nova_compute[192716]: 2025-10-07 22:02:10.152 2 WARNING neutronclient.v2_0.client [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:02:10 compute-0 nova_compute[192716]: 2025-10-07 22:02:10.232 2 WARNING neutronclient.v2_0.client [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:02:10 compute-0 podman[220105]: 2025-10-07 22:02:10.84479718 +0000 UTC m=+0.081092807 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., container_name=openstack_network_exporter, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.component=ubi9-minimal-container, version=9.6, 
io.openshift.expose-services=, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct 07 22:02:10 compute-0 nova_compute[192716]: 2025-10-07 22:02:10.982 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:11 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:11.096 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:e1:cc', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '62:84:ba:a4:1b:31'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:02:11 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:11.098 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 07 22:02:11 compute-0 nova_compute[192716]: 2025-10-07 22:02:11.327 2 DEBUG nova.network.neutron [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: fd9b0d7e-e882-4574-9e62-a1d142bb6a16] Port c4e362fa-a4d3-43d8-80c2-29fa599f7f3f updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Oct 07 22:02:11 compute-0 nova_compute[192716]: 2025-10-07 22:02:11.338 2 DEBUG nova.compute.manager [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpus27fwmk',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='fd9b0d7e-e882-4574-9e62-a1d142bb6a16',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Oct 07 22:02:14 compute-0 systemd[1]: Starting libvirt proxy daemon...
Oct 07 22:02:14 compute-0 systemd[1]: Started libvirt proxy daemon.
Oct 07 22:02:14 compute-0 NetworkManager[51722]: <info>  [1759874534.6430] manager: (tapc4e362fa-a4): new Tun device (/org/freedesktop/NetworkManager/Devices/48)
Oct 07 22:02:14 compute-0 kernel: tapc4e362fa-a4: entered promiscuous mode
Oct 07 22:02:14 compute-0 ovn_controller[94904]: 2025-10-07T22:02:14Z|00113|binding|INFO|Claiming lport c4e362fa-a4d3-43d8-80c2-29fa599f7f3f for this additional chassis.
Oct 07 22:02:14 compute-0 ovn_controller[94904]: 2025-10-07T22:02:14Z|00114|binding|INFO|c4e362fa-a4d3-43d8-80c2-29fa599f7f3f: Claiming fa:16:3e:4d:8e:a6 10.100.0.11
Oct 07 22:02:14 compute-0 nova_compute[192716]: 2025-10-07 22:02:14.647 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:14 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:14.657 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4d:8e:a6 10.100.0.11'], port_security=['fa:16:3e:4d:8e:a6 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'fd9b0d7e-e882-4574-9e62-a1d142bb6a16', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-726154fe-bda6-431d-b983-7caa973a9e17', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ad27d63f39845acba6b21828806b82a', 'neutron:revision_number': '10', 'neutron:security_group_ids': '7085a98e-3cea-46c4-a04e-730e3c566bcd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4057f821-2b26-4a21-8644-5757b0f352fc, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=c4e362fa-a4d3-43d8-80c2-29fa599f7f3f) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:02:14 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:14.659 103791 INFO neutron.agent.ovn.metadata.agent [-] Port c4e362fa-a4d3-43d8-80c2-29fa599f7f3f in datapath 726154fe-bda6-431d-b983-7caa973a9e17 unbound from our chassis
Oct 07 22:02:14 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:14.661 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 726154fe-bda6-431d-b983-7caa973a9e17
Oct 07 22:02:14 compute-0 ovn_controller[94904]: 2025-10-07T22:02:14Z|00115|binding|INFO|Setting lport c4e362fa-a4d3-43d8-80c2-29fa599f7f3f ovn-installed in OVS
Oct 07 22:02:14 compute-0 nova_compute[192716]: 2025-10-07 22:02:14.683 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:14 compute-0 nova_compute[192716]: 2025-10-07 22:02:14.684 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:14 compute-0 nova_compute[192716]: 2025-10-07 22:02:14.691 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:14 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:14.692 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[5cf07e25-3fd4-43c9-8f66-348b6ed9c37b]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:02:14 compute-0 nova_compute[192716]: 2025-10-07 22:02:14.701 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:14 compute-0 systemd-udevd[220160]: Network interface NamePolicy= disabled on kernel command line.
Oct 07 22:02:14 compute-0 systemd-machined[152719]: New machine qemu-9-instance-0000000c.
Oct 07 22:02:14 compute-0 NetworkManager[51722]: <info>  [1759874534.7257] device (tapc4e362fa-a4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 07 22:02:14 compute-0 NetworkManager[51722]: <info>  [1759874534.7271] device (tapc4e362fa-a4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 07 22:02:14 compute-0 systemd[1]: Started Virtual Machine qemu-9-instance-0000000c.
Oct 07 22:02:14 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:14.731 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[f72fa86a-f9fc-4d5e-88e7-14138423ac83]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:02:14 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:14.734 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[ec42c677-34b8-4b25-9391-df9958d1b79b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:02:14 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:14.772 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[af49e9aa-344e-4ab8-9c27-341ea24c3763]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:02:14 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:14.803 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[574a79c7-fee0-4592-84cb-78658fde385e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap726154fe-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:af:b2:2e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 31], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 423291, 'reachable_time': 41705, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220171, 'error': None, 'target': 'ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:02:14 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:14.821 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[4885cbe2-9f2f-48ac-b9a0-9139b9f7976d]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap726154fe-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 423304, 'tstamp': 423304}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220173, 'error': None, 'target': 'ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap726154fe-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 423307, 'tstamp': 423307}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220173, 'error': None, 'target': 'ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:02:14 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:14.822 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap726154fe-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:02:14 compute-0 nova_compute[192716]: 2025-10-07 22:02:14.824 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:14 compute-0 nova_compute[192716]: 2025-10-07 22:02:14.826 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:14 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:14.826 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap726154fe-b0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:02:14 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:14.827 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:02:14 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:14.827 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap726154fe-b0, col_values=(('external_ids', {'iface-id': 'b6dfddd4-019f-4508-ab9b-37759605366f'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:02:14 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:14.827 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:02:14 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:14.829 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[34b887b1-5fa5-40d0-ac94-b8163d3699e6]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-726154fe-bda6-431d-b983-7caa973a9e17\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/726154fe-bda6-431d-b983-7caa973a9e17.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 726154fe-bda6-431d-b983-7caa973a9e17\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:02:15 compute-0 nova_compute[192716]: 2025-10-07 22:02:15.144 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:17 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:17.099 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=dca786dc-b408-4181-8e47-0e14c60f13da, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:02:17 compute-0 ovn_controller[94904]: 2025-10-07T22:02:17Z|00116|binding|INFO|Claiming lport c4e362fa-a4d3-43d8-80c2-29fa599f7f3f for this chassis.
Oct 07 22:02:17 compute-0 ovn_controller[94904]: 2025-10-07T22:02:17Z|00117|binding|INFO|c4e362fa-a4d3-43d8-80c2-29fa599f7f3f: Claiming fa:16:3e:4d:8e:a6 10.100.0.11
Oct 07 22:02:17 compute-0 ovn_controller[94904]: 2025-10-07T22:02:17Z|00118|binding|INFO|Setting lport c4e362fa-a4d3-43d8-80c2-29fa599f7f3f up in Southbound
Oct 07 22:02:18 compute-0 nova_compute[192716]: 2025-10-07 22:02:18.444 2 INFO nova.compute.manager [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: fd9b0d7e-e882-4574-9e62-a1d142bb6a16] Post operation of migration started
Oct 07 22:02:18 compute-0 nova_compute[192716]: 2025-10-07 22:02:18.444 2 WARNING neutronclient.v2_0.client [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:02:18 compute-0 nova_compute[192716]: 2025-10-07 22:02:18.756 2 WARNING neutronclient.v2_0.client [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:02:18 compute-0 nova_compute[192716]: 2025-10-07 22:02:18.757 2 WARNING neutronclient.v2_0.client [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:02:18 compute-0 nova_compute[192716]: 2025-10-07 22:02:18.834 2 DEBUG oslo_concurrency.lockutils [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "refresh_cache-fd9b0d7e-e882-4574-9e62-a1d142bb6a16" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 07 22:02:18 compute-0 nova_compute[192716]: 2025-10-07 22:02:18.834 2 DEBUG oslo_concurrency.lockutils [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquired lock "refresh_cache-fd9b0d7e-e882-4574-9e62-a1d142bb6a16" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 07 22:02:18 compute-0 nova_compute[192716]: 2025-10-07 22:02:18.835 2 DEBUG nova.network.neutron [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: fd9b0d7e-e882-4574-9e62-a1d142bb6a16] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 07 22:02:19 compute-0 nova_compute[192716]: 2025-10-07 22:02:19.343 2 WARNING neutronclient.v2_0.client [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:02:19 compute-0 nova_compute[192716]: 2025-10-07 22:02:19.704 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:20 compute-0 nova_compute[192716]: 2025-10-07 22:02:20.147 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:20 compute-0 nova_compute[192716]: 2025-10-07 22:02:20.258 2 WARNING neutronclient.v2_0.client [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:02:20 compute-0 nova_compute[192716]: 2025-10-07 22:02:20.261 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:02:20 compute-0 nova_compute[192716]: 2025-10-07 22:02:20.396 2 DEBUG nova.network.neutron [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: fd9b0d7e-e882-4574-9e62-a1d142bb6a16] Updating instance_info_cache with network_info: [{"id": "c4e362fa-a4d3-43d8-80c2-29fa599f7f3f", "address": "fa:16:3e:4d:8e:a6", "network": {"id": "726154fe-bda6-431d-b983-7caa973a9e17", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-468758744-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "df550d234d364e7fb20f9ac88392be8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4e362fa-a4", "ovs_interfaceid": "c4e362fa-a4d3-43d8-80c2-29fa599f7f3f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:02:20 compute-0 nova_compute[192716]: 2025-10-07 22:02:20.773 2 WARNING nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] While synchronizing instance power states, found 1 instances in the database and 2 instances on the hypervisor.
Oct 07 22:02:20 compute-0 nova_compute[192716]: 2025-10-07 22:02:20.773 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Triggering sync for uuid 05a008ac-6976-48f0-8fc6-2795863bdf61 _sync_power_states /usr/lib/python3.12/site-packages/nova/compute/manager.py:11020
Oct 07 22:02:20 compute-0 nova_compute[192716]: 2025-10-07 22:02:20.774 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "05a008ac-6976-48f0-8fc6-2795863bdf61" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:02:20 compute-0 nova_compute[192716]: 2025-10-07 22:02:20.775 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "05a008ac-6976-48f0-8fc6-2795863bdf61" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:02:20 compute-0 nova_compute[192716]: 2025-10-07 22:02:20.904 2 DEBUG oslo_concurrency.lockutils [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Releasing lock "refresh_cache-fd9b0d7e-e882-4574-9e62-a1d142bb6a16" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 07 22:02:21 compute-0 nova_compute[192716]: 2025-10-07 22:02:21.286 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "05a008ac-6976-48f0-8fc6-2795863bdf61" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.512s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:02:21 compute-0 nova_compute[192716]: 2025-10-07 22:02:21.421 2 DEBUG oslo_concurrency.lockutils [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:02:21 compute-0 nova_compute[192716]: 2025-10-07 22:02:21.422 2 DEBUG oslo_concurrency.lockutils [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:02:21 compute-0 nova_compute[192716]: 2025-10-07 22:02:21.422 2 DEBUG oslo_concurrency.lockutils [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:02:21 compute-0 nova_compute[192716]: 2025-10-07 22:02:21.426 2 INFO nova.virt.libvirt.driver [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: fd9b0d7e-e882-4574-9e62-a1d142bb6a16] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Oct 07 22:02:21 compute-0 virtqemud[192532]: Domain id=9 name='instance-0000000c' uuid=fd9b0d7e-e882-4574-9e62-a1d142bb6a16 is tainted: custom-monitor
Oct 07 22:02:22 compute-0 nova_compute[192716]: 2025-10-07 22:02:22.432 2 INFO nova.virt.libvirt.driver [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: fd9b0d7e-e882-4574-9e62-a1d142bb6a16] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Oct 07 22:02:23 compute-0 nova_compute[192716]: 2025-10-07 22:02:23.440 2 INFO nova.virt.libvirt.driver [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: fd9b0d7e-e882-4574-9e62-a1d142bb6a16] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Oct 07 22:02:23 compute-0 nova_compute[192716]: 2025-10-07 22:02:23.445 2 DEBUG nova.compute.manager [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: fd9b0d7e-e882-4574-9e62-a1d142bb6a16] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 07 22:02:23 compute-0 nova_compute[192716]: 2025-10-07 22:02:23.954 2 DEBUG nova.objects.instance [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: fd9b0d7e-e882-4574-9e62-a1d142bb6a16] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Oct 07 22:02:24 compute-0 nova_compute[192716]: 2025-10-07 22:02:24.707 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:24 compute-0 nova_compute[192716]: 2025-10-07 22:02:24.976 2 WARNING neutronclient.v2_0.client [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:02:25 compute-0 nova_compute[192716]: 2025-10-07 22:02:25.084 2 WARNING neutronclient.v2_0.client [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:02:25 compute-0 nova_compute[192716]: 2025-10-07 22:02:25.084 2 WARNING neutronclient.v2_0.client [None req-29aa0acb-54ff-426c-914f-f9f20d26bcae 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:02:25 compute-0 nova_compute[192716]: 2025-10-07 22:02:25.150 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:25.625 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:02:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:25.626 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:02:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:25.626 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:02:25 compute-0 podman[220195]: 2025-10-07 22:02:25.857542951 +0000 UTC m=+0.082132277 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, config_id=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Oct 07 22:02:25 compute-0 podman[220196]: 2025-10-07 22:02:25.86167528 +0000 UTC m=+0.086864054 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true)
Oct 07 22:02:26 compute-0 nova_compute[192716]: 2025-10-07 22:02:26.504 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:02:27 compute-0 nova_compute[192716]: 2025-10-07 22:02:27.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:02:27 compute-0 nova_compute[192716]: 2025-10-07 22:02:27.991 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 07 22:02:28 compute-0 nova_compute[192716]: 2025-10-07 22:02:28.809 2 DEBUG oslo_concurrency.lockutils [None req-2bab8197-648d-4619-ae14-b388460bdc5d db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Acquiring lock "05a008ac-6976-48f0-8fc6-2795863bdf61" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:02:28 compute-0 nova_compute[192716]: 2025-10-07 22:02:28.810 2 DEBUG oslo_concurrency.lockutils [None req-2bab8197-648d-4619-ae14-b388460bdc5d db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Lock "05a008ac-6976-48f0-8fc6-2795863bdf61" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:02:28 compute-0 nova_compute[192716]: 2025-10-07 22:02:28.810 2 DEBUG oslo_concurrency.lockutils [None req-2bab8197-648d-4619-ae14-b388460bdc5d db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Acquiring lock "05a008ac-6976-48f0-8fc6-2795863bdf61-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:02:28 compute-0 nova_compute[192716]: 2025-10-07 22:02:28.810 2 DEBUG oslo_concurrency.lockutils [None req-2bab8197-648d-4619-ae14-b388460bdc5d db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Lock "05a008ac-6976-48f0-8fc6-2795863bdf61-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:02:28 compute-0 nova_compute[192716]: 2025-10-07 22:02:28.810 2 DEBUG oslo_concurrency.lockutils [None req-2bab8197-648d-4619-ae14-b388460bdc5d db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Lock "05a008ac-6976-48f0-8fc6-2795863bdf61-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:02:28 compute-0 nova_compute[192716]: 2025-10-07 22:02:28.824 2 INFO nova.compute.manager [None req-2bab8197-648d-4619-ae14-b388460bdc5d db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Terminating instance
Oct 07 22:02:28 compute-0 podman[220238]: 2025-10-07 22:02:28.861071553 +0000 UTC m=+0.090621783 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 07 22:02:29 compute-0 nova_compute[192716]: 2025-10-07 22:02:29.342 2 DEBUG nova.compute.manager [None req-2bab8197-648d-4619-ae14-b388460bdc5d db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 07 22:02:29 compute-0 kernel: tapa12818fb-48 (unregistering): left promiscuous mode
Oct 07 22:02:29 compute-0 NetworkManager[51722]: <info>  [1759874549.3765] device (tapa12818fb-48): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 07 22:02:29 compute-0 ovn_controller[94904]: 2025-10-07T22:02:29Z|00119|binding|INFO|Releasing lport a12818fb-48f8-4828-9483-8e2f0c930534 from this chassis (sb_readonly=0)
Oct 07 22:02:29 compute-0 ovn_controller[94904]: 2025-10-07T22:02:29Z|00120|binding|INFO|Setting lport a12818fb-48f8-4828-9483-8e2f0c930534 down in Southbound
Oct 07 22:02:29 compute-0 nova_compute[192716]: 2025-10-07 22:02:29.390 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:29 compute-0 ovn_controller[94904]: 2025-10-07T22:02:29Z|00121|binding|INFO|Removing iface tapa12818fb-48 ovn-installed in OVS
Oct 07 22:02:29 compute-0 nova_compute[192716]: 2025-10-07 22:02:29.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:29 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:29.408 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:64:13:25 10.100.0.14'], port_security=['fa:16:3e:64:13:25 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '05a008ac-6976-48f0-8fc6-2795863bdf61', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-726154fe-bda6-431d-b983-7caa973a9e17', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ad27d63f39845acba6b21828806b82a', 'neutron:revision_number': '5', 'neutron:security_group_ids': '7085a98e-3cea-46c4-a04e-730e3c566bcd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4057f821-2b26-4a21-8644-5757b0f352fc, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], logical_port=a12818fb-48f8-4828-9483-8e2f0c930534) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f071072e510>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:02:29 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:29.409 103791 INFO neutron.agent.ovn.metadata.agent [-] Port a12818fb-48f8-4828-9483-8e2f0c930534 in datapath 726154fe-bda6-431d-b983-7caa973a9e17 unbound from our chassis
Oct 07 22:02:29 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:29.412 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 726154fe-bda6-431d-b983-7caa973a9e17
Oct 07 22:02:29 compute-0 nova_compute[192716]: 2025-10-07 22:02:29.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:29 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:29.444 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[b4420e0c-534c-4ae7-a67f-a5b6d6cf3b19]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:02:29 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000d.scope: Deactivated successfully.
Oct 07 22:02:29 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000d.scope: Consumed 13.651s CPU time.
Oct 07 22:02:29 compute-0 systemd-machined[152719]: Machine qemu-8-instance-0000000d terminated.
Oct 07 22:02:29 compute-0 sshd-session[220236]: Invalid user sysadmin from 103.115.24.11 port 51960
Oct 07 22:02:29 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:29.487 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[e0c2cf2e-693d-4522-a83f-7ad2e84d5974]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:02:29 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:29.490 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[045f372a-344b-4f95-9cc1-1c59999aae2e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:02:29 compute-0 sshd-session[220236]: pam_unix(sshd:auth): check pass; user unknown
Oct 07 22:02:29 compute-0 sshd-session[220236]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.115.24.11
Oct 07 22:02:29 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:29.519 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[cefbdbb0-a5a9-4201-a6ed-3072ed99b90e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:02:29 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:29.536 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[4144c880-0527-4804-83db-3b04e6f09776]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap726154fe-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:af:b2:2e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 31], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 423291, 'reachable_time': 41705, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220277, 'error': None, 'target': 'ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:02:29 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:29.550 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[7acd1aa4-9914-4cfa-8ba1-7b022afe99f6]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap726154fe-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 423304, 'tstamp': 423304}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220278, 'error': None, 'target': 'ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap726154fe-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 423307, 'tstamp': 423307}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220278, 'error': None, 'target': 'ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:02:29 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:29.551 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap726154fe-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:02:29 compute-0 nova_compute[192716]: 2025-10-07 22:02:29.553 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:29 compute-0 nova_compute[192716]: 2025-10-07 22:02:29.558 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:29 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:29.558 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap726154fe-b0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:02:29 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:29.558 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:02:29 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:29.559 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap726154fe-b0, col_values=(('external_ids', {'iface-id': 'b6dfddd4-019f-4508-ab9b-37759605366f'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:02:29 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:29.559 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:02:29 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:29.560 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[6a0835e0-d166-46eb-87b9-2b23fdb9c2c2]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-726154fe-bda6-431d-b983-7caa973a9e17\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/726154fe-bda6-431d-b983-7caa973a9e17.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 726154fe-bda6-431d-b983-7caa973a9e17\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:02:29 compute-0 kernel: tapa12818fb-48: entered promiscuous mode
Oct 07 22:02:29 compute-0 nova_compute[192716]: 2025-10-07 22:02:29.567 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:29 compute-0 kernel: tapa12818fb-48 (unregistering): left promiscuous mode
Oct 07 22:02:29 compute-0 ovn_controller[94904]: 2025-10-07T22:02:29Z|00122|binding|INFO|Claiming lport a12818fb-48f8-4828-9483-8e2f0c930534 for this chassis.
Oct 07 22:02:29 compute-0 ovn_controller[94904]: 2025-10-07T22:02:29Z|00123|binding|INFO|a12818fb-48f8-4828-9483-8e2f0c930534: Claiming fa:16:3e:64:13:25 10.100.0.14
Oct 07 22:02:29 compute-0 NetworkManager[51722]: <info>  [1759874549.5702] manager: (tapa12818fb-48): new Tun device (/org/freedesktop/NetworkManager/Devices/49)
Oct 07 22:02:29 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:29.579 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:64:13:25 10.100.0.14'], port_security=['fa:16:3e:64:13:25 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '05a008ac-6976-48f0-8fc6-2795863bdf61', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-726154fe-bda6-431d-b983-7caa973a9e17', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ad27d63f39845acba6b21828806b82a', 'neutron:revision_number': '5', 'neutron:security_group_ids': '7085a98e-3cea-46c4-a04e-730e3c566bcd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4057f821-2b26-4a21-8644-5757b0f352fc, chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], logical_port=a12818fb-48f8-4828-9483-8e2f0c930534) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:02:29 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:29.579 103791 INFO neutron.agent.ovn.metadata.agent [-] Port a12818fb-48f8-4828-9483-8e2f0c930534 in datapath 726154fe-bda6-431d-b983-7caa973a9e17 bound to our chassis
Oct 07 22:02:29 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:29.581 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 726154fe-bda6-431d-b983-7caa973a9e17
Oct 07 22:02:29 compute-0 nova_compute[192716]: 2025-10-07 22:02:29.587 2 DEBUG nova.compute.manager [req-63521b6f-4573-44ba-bbf0-33bf7cb7567a req-52aa8a6e-4dc3-4ebf-83c6-093cd860a953 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Received event network-vif-unplugged-a12818fb-48f8-4828-9483-8e2f0c930534 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:02:29 compute-0 nova_compute[192716]: 2025-10-07 22:02:29.588 2 DEBUG oslo_concurrency.lockutils [req-63521b6f-4573-44ba-bbf0-33bf7cb7567a req-52aa8a6e-4dc3-4ebf-83c6-093cd860a953 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "05a008ac-6976-48f0-8fc6-2795863bdf61-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:02:29 compute-0 nova_compute[192716]: 2025-10-07 22:02:29.588 2 DEBUG oslo_concurrency.lockutils [req-63521b6f-4573-44ba-bbf0-33bf7cb7567a req-52aa8a6e-4dc3-4ebf-83c6-093cd860a953 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "05a008ac-6976-48f0-8fc6-2795863bdf61-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:02:29 compute-0 nova_compute[192716]: 2025-10-07 22:02:29.588 2 DEBUG oslo_concurrency.lockutils [req-63521b6f-4573-44ba-bbf0-33bf7cb7567a req-52aa8a6e-4dc3-4ebf-83c6-093cd860a953 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "05a008ac-6976-48f0-8fc6-2795863bdf61-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:02:29 compute-0 nova_compute[192716]: 2025-10-07 22:02:29.588 2 DEBUG nova.compute.manager [req-63521b6f-4573-44ba-bbf0-33bf7cb7567a req-52aa8a6e-4dc3-4ebf-83c6-093cd860a953 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] No waiting events found dispatching network-vif-unplugged-a12818fb-48f8-4828-9483-8e2f0c930534 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 22:02:29 compute-0 nova_compute[192716]: 2025-10-07 22:02:29.588 2 DEBUG nova.compute.manager [req-63521b6f-4573-44ba-bbf0-33bf7cb7567a req-52aa8a6e-4dc3-4ebf-83c6-093cd860a953 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Received event network-vif-unplugged-a12818fb-48f8-4828-9483-8e2f0c930534 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 07 22:02:29 compute-0 ovn_controller[94904]: 2025-10-07T22:02:29Z|00124|binding|INFO|Setting lport a12818fb-48f8-4828-9483-8e2f0c930534 ovn-installed in OVS
Oct 07 22:02:29 compute-0 ovn_controller[94904]: 2025-10-07T22:02:29Z|00125|binding|INFO|Setting lport a12818fb-48f8-4828-9483-8e2f0c930534 up in Southbound
Oct 07 22:02:29 compute-0 ovn_controller[94904]: 2025-10-07T22:02:29Z|00126|binding|INFO|Releasing lport a12818fb-48f8-4828-9483-8e2f0c930534 from this chassis (sb_readonly=1)
Oct 07 22:02:29 compute-0 ovn_controller[94904]: 2025-10-07T22:02:29Z|00127|if_status|INFO|Dropped 2 log messages in last 359 seconds (most recently, 359 seconds ago) due to excessive rate
Oct 07 22:02:29 compute-0 ovn_controller[94904]: 2025-10-07T22:02:29Z|00128|if_status|INFO|Not setting lport a12818fb-48f8-4828-9483-8e2f0c930534 down as sb is readonly
Oct 07 22:02:29 compute-0 nova_compute[192716]: 2025-10-07 22:02:29.601 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:29 compute-0 ovn_controller[94904]: 2025-10-07T22:02:29Z|00129|binding|INFO|Removing iface tapa12818fb-48 ovn-installed in OVS
Oct 07 22:02:29 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:29.603 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[8f966273-e816-4ede-98de-b04d531aafe7]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:02:29 compute-0 nova_compute[192716]: 2025-10-07 22:02:29.603 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:29 compute-0 ovn_controller[94904]: 2025-10-07T22:02:29Z|00130|binding|INFO|Releasing lport a12818fb-48f8-4828-9483-8e2f0c930534 from this chassis (sb_readonly=0)
Oct 07 22:02:29 compute-0 ovn_controller[94904]: 2025-10-07T22:02:29Z|00131|binding|INFO|Setting lport a12818fb-48f8-4828-9483-8e2f0c930534 down in Southbound
Oct 07 22:02:29 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:29.611 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:64:13:25 10.100.0.14'], port_security=['fa:16:3e:64:13:25 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '05a008ac-6976-48f0-8fc6-2795863bdf61', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-726154fe-bda6-431d-b983-7caa973a9e17', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ad27d63f39845acba6b21828806b82a', 'neutron:revision_number': '5', 'neutron:security_group_ids': '7085a98e-3cea-46c4-a04e-730e3c566bcd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4057f821-2b26-4a21-8644-5757b0f352fc, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], logical_port=a12818fb-48f8-4828-9483-8e2f0c930534) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f071072e510>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:02:29 compute-0 nova_compute[192716]: 2025-10-07 22:02:29.624 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:29 compute-0 nova_compute[192716]: 2025-10-07 22:02:29.636 2 INFO nova.virt.libvirt.driver [-] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Instance destroyed successfully.
Oct 07 22:02:29 compute-0 nova_compute[192716]: 2025-10-07 22:02:29.637 2 DEBUG nova.objects.instance [None req-2bab8197-648d-4619-ae14-b388460bdc5d db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Lazy-loading 'resources' on Instance uuid 05a008ac-6976-48f0-8fc6-2795863bdf61 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 22:02:29 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:29.640 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[6f8b2687-6126-4dd0-b405-b250e0e24a07]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:02:29 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:29.643 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[9bb9f4d8-ddef-43f0-849f-e590941f78cd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:02:29 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:29.676 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[7cae1875-1342-437a-bc0a-17ba90e8fe1f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:02:29 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:29.690 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[8c60a25f-0db4-42fa-932c-67c0b622834f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap726154fe-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:af:b2:2e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 30, 'tx_packets': 9, 'rx_bytes': 1756, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 30, 'tx_packets': 9, 'rx_bytes': 1756, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 31], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 423291, 'reachable_time': 41705, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220299, 'error': None, 'target': 'ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:02:29 compute-0 nova_compute[192716]: 2025-10-07 22:02:29.707 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:29 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:29.712 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[350e320d-33c8-404e-a85c-32edff30056e]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap726154fe-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 423304, 'tstamp': 423304}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220300, 'error': None, 'target': 'ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap726154fe-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 423307, 'tstamp': 423307}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220300, 'error': None, 'target': 'ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:02:29 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:29.714 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap726154fe-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:02:29 compute-0 nova_compute[192716]: 2025-10-07 22:02:29.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:29 compute-0 nova_compute[192716]: 2025-10-07 22:02:29.719 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:29 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:29.720 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap726154fe-b0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:02:29 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:29.720 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:02:29 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:29.721 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap726154fe-b0, col_values=(('external_ids', {'iface-id': 'b6dfddd4-019f-4508-ab9b-37759605366f'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:02:29 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:29.721 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:02:29 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:29.722 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[6c8b455b-bbc0-4e69-a6a1-3c682a2a73b6]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-726154fe-bda6-431d-b983-7caa973a9e17\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/726154fe-bda6-431d-b983-7caa973a9e17.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 726154fe-bda6-431d-b983-7caa973a9e17\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:02:29 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:29.723 103791 INFO neutron.agent.ovn.metadata.agent [-] Port a12818fb-48f8-4828-9483-8e2f0c930534 in datapath 726154fe-bda6-431d-b983-7caa973a9e17 unbound from our chassis
Oct 07 22:02:29 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:29.725 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 726154fe-bda6-431d-b983-7caa973a9e17
Oct 07 22:02:29 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:29.746 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[bdf75d68-9af4-444a-8fc5-65e02ee864e3]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:02:29 compute-0 podman[203153]: time="2025-10-07T22:02:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:02:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:02:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20751 "" "Go-http-client/1.1"
Oct 07 22:02:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:02:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3486 "" "Go-http-client/1.1"
Oct 07 22:02:29 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:29.790 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[c67079f0-2188-4107-9739-9c5412161723]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:02:29 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:29.795 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[254c7dae-aa42-47ef-87cb-cfaeb321b46c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:02:29 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:29.841 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[8a10f20d-3cf1-4a0d-a6b6-e2f360a931ad]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:02:29 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:29.870 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[111c3ce3-f93f-4be6-a910-48395e3cb0d2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap726154fe-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:af:b2:2e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 30, 'tx_packets': 11, 'rx_bytes': 1756, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 30, 'tx_packets': 11, 'rx_bytes': 1756, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 31], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 423291, 'reachable_time': 41705, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220308, 'error': None, 'target': 'ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:02:29 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:29.898 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[351bc51d-4167-4ca0-b937-2d2fafdf6c5c]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap726154fe-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 423304, 'tstamp': 423304}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220309, 'error': None, 'target': 'ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap726154fe-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 423307, 'tstamp': 423307}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220309, 'error': None, 'target': 'ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:02:29 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:29.899 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap726154fe-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:02:29 compute-0 nova_compute[192716]: 2025-10-07 22:02:29.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:29 compute-0 nova_compute[192716]: 2025-10-07 22:02:29.906 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:29 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:29.907 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap726154fe-b0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:02:29 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:29.908 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:02:29 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:29.908 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap726154fe-b0, col_values=(('external_ids', {'iface-id': 'b6dfddd4-019f-4508-ab9b-37759605366f'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:02:29 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:29.908 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:02:29 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:29.910 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[94b28f5c-7f0a-4add-bd35-224a39a645be]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-726154fe-bda6-431d-b983-7caa973a9e17\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/726154fe-bda6-431d-b983-7caa973a9e17.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 726154fe-bda6-431d-b983-7caa973a9e17\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:02:29 compute-0 nova_compute[192716]: 2025-10-07 22:02:29.985 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:02:29 compute-0 nova_compute[192716]: 2025-10-07 22:02:29.989 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:02:30 compute-0 nova_compute[192716]: 2025-10-07 22:02:30.142 2 DEBUG nova.virt.libvirt.vif [None req-2bab8197-648d-4619-ae14-b388460bdc5d db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-10-07T22:01:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-907284284',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-907284284',id=13,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-07T22:01:41Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3ad27d63f39845acba6b21828806b82a',ramdisk_id='',reservation_id='r-swr0k1fo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_mi
n_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-152687663',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-152687663-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T22:01:41Z,user_data=None,user_id='db99335261504aa7b84c7d30ec17d679',uuid=05a008ac-6976-48f0-8fc6-2795863bdf61,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a12818fb-48f8-4828-9483-8e2f0c930534", "address": "fa:16:3e:64:13:25", "network": {"id": "726154fe-bda6-431d-b983-7caa973a9e17", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-468758744-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "df550d234d364e7fb20f9ac88392be8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa12818fb-48", "ovs_interfaceid": "a12818fb-48f8-4828-9483-8e2f0c930534", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 07 22:02:30 compute-0 nova_compute[192716]: 2025-10-07 22:02:30.143 2 DEBUG nova.network.os_vif_util [None req-2bab8197-648d-4619-ae14-b388460bdc5d db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Converting VIF {"id": "a12818fb-48f8-4828-9483-8e2f0c930534", "address": "fa:16:3e:64:13:25", "network": {"id": "726154fe-bda6-431d-b983-7caa973a9e17", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-468758744-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "df550d234d364e7fb20f9ac88392be8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa12818fb-48", "ovs_interfaceid": "a12818fb-48f8-4828-9483-8e2f0c930534", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 22:02:30 compute-0 nova_compute[192716]: 2025-10-07 22:02:30.144 2 DEBUG nova.network.os_vif_util [None req-2bab8197-648d-4619-ae14-b388460bdc5d db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:64:13:25,bridge_name='br-int',has_traffic_filtering=True,id=a12818fb-48f8-4828-9483-8e2f0c930534,network=Network(726154fe-bda6-431d-b983-7caa973a9e17),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa12818fb-48') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 22:02:30 compute-0 nova_compute[192716]: 2025-10-07 22:02:30.145 2 DEBUG os_vif [None req-2bab8197-648d-4619-ae14-b388460bdc5d db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:64:13:25,bridge_name='br-int',has_traffic_filtering=True,id=a12818fb-48f8-4828-9483-8e2f0c930534,network=Network(726154fe-bda6-431d-b983-7caa973a9e17),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa12818fb-48') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 07 22:02:30 compute-0 nova_compute[192716]: 2025-10-07 22:02:30.147 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:30 compute-0 nova_compute[192716]: 2025-10-07 22:02:30.148 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa12818fb-48, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:02:30 compute-0 nova_compute[192716]: 2025-10-07 22:02:30.150 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:30 compute-0 nova_compute[192716]: 2025-10-07 22:02:30.151 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:30 compute-0 nova_compute[192716]: 2025-10-07 22:02:30.153 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:30 compute-0 nova_compute[192716]: 2025-10-07 22:02:30.153 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=b54e8b96-3786-410d-a31f-9ac742ce35db) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:02:30 compute-0 nova_compute[192716]: 2025-10-07 22:02:30.154 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:30 compute-0 nova_compute[192716]: 2025-10-07 22:02:30.156 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:30 compute-0 nova_compute[192716]: 2025-10-07 22:02:30.158 2 INFO os_vif [None req-2bab8197-648d-4619-ae14-b388460bdc5d db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:64:13:25,bridge_name='br-int',has_traffic_filtering=True,id=a12818fb-48f8-4828-9483-8e2f0c930534,network=Network(726154fe-bda6-431d-b983-7caa973a9e17),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa12818fb-48')
Oct 07 22:02:30 compute-0 nova_compute[192716]: 2025-10-07 22:02:30.159 2 INFO nova.virt.libvirt.driver [None req-2bab8197-648d-4619-ae14-b388460bdc5d db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Deleting instance files /var/lib/nova/instances/05a008ac-6976-48f0-8fc6-2795863bdf61_del
Oct 07 22:02:30 compute-0 nova_compute[192716]: 2025-10-07 22:02:30.160 2 INFO nova.virt.libvirt.driver [None req-2bab8197-648d-4619-ae14-b388460bdc5d db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Deletion of /var/lib/nova/instances/05a008ac-6976-48f0-8fc6-2795863bdf61_del complete
Oct 07 22:02:30 compute-0 nova_compute[192716]: 2025-10-07 22:02:30.674 2 INFO nova.compute.manager [None req-2bab8197-648d-4619-ae14-b388460bdc5d db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Took 1.33 seconds to destroy the instance on the hypervisor.
Oct 07 22:02:30 compute-0 nova_compute[192716]: 2025-10-07 22:02:30.675 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-2bab8197-648d-4619-ae14-b388460bdc5d db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 07 22:02:30 compute-0 nova_compute[192716]: 2025-10-07 22:02:30.675 2 DEBUG nova.compute.manager [-] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 07 22:02:30 compute-0 nova_compute[192716]: 2025-10-07 22:02:30.675 2 DEBUG nova.network.neutron [-] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 07 22:02:30 compute-0 nova_compute[192716]: 2025-10-07 22:02:30.675 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:02:30 compute-0 nova_compute[192716]: 2025-10-07 22:02:30.953 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:02:31 compute-0 nova_compute[192716]: 2025-10-07 22:02:31.288 2 DEBUG nova.compute.manager [req-27ccb818-491f-4fa6-a0a2-738c4adc335d req-80859513-c4b4-4ed6-915d-05efc30f9fb2 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Received event network-vif-deleted-a12818fb-48f8-4828-9483-8e2f0c930534 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:02:31 compute-0 nova_compute[192716]: 2025-10-07 22:02:31.289 2 INFO nova.compute.manager [req-27ccb818-491f-4fa6-a0a2-738c4adc335d req-80859513-c4b4-4ed6-915d-05efc30f9fb2 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Neutron deleted interface a12818fb-48f8-4828-9483-8e2f0c930534; detaching it from the instance and deleting it from the info cache
Oct 07 22:02:31 compute-0 nova_compute[192716]: 2025-10-07 22:02:31.289 2 DEBUG nova.network.neutron [req-27ccb818-491f-4fa6-a0a2-738c4adc335d req-80859513-c4b4-4ed6-915d-05efc30f9fb2 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:02:31 compute-0 openstack_network_exporter[205305]: ERROR   22:02:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:02:31 compute-0 openstack_network_exporter[205305]: ERROR   22:02:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:02:31 compute-0 openstack_network_exporter[205305]: ERROR   22:02:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:02:31 compute-0 openstack_network_exporter[205305]: ERROR   22:02:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:02:31 compute-0 openstack_network_exporter[205305]: ERROR   22:02:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:02:31 compute-0 nova_compute[192716]: 2025-10-07 22:02:31.669 2 DEBUG nova.compute.manager [req-ed1a5259-5801-4e33-948e-db3963996630 req-823cd198-8d8e-4453-b752-b78ba2f23986 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Received event network-vif-unplugged-a12818fb-48f8-4828-9483-8e2f0c930534 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:02:31 compute-0 nova_compute[192716]: 2025-10-07 22:02:31.670 2 DEBUG oslo_concurrency.lockutils [req-ed1a5259-5801-4e33-948e-db3963996630 req-823cd198-8d8e-4453-b752-b78ba2f23986 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "05a008ac-6976-48f0-8fc6-2795863bdf61-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:02:31 compute-0 nova_compute[192716]: 2025-10-07 22:02:31.670 2 DEBUG oslo_concurrency.lockutils [req-ed1a5259-5801-4e33-948e-db3963996630 req-823cd198-8d8e-4453-b752-b78ba2f23986 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "05a008ac-6976-48f0-8fc6-2795863bdf61-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:02:31 compute-0 nova_compute[192716]: 2025-10-07 22:02:31.671 2 DEBUG oslo_concurrency.lockutils [req-ed1a5259-5801-4e33-948e-db3963996630 req-823cd198-8d8e-4453-b752-b78ba2f23986 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "05a008ac-6976-48f0-8fc6-2795863bdf61-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:02:31 compute-0 nova_compute[192716]: 2025-10-07 22:02:31.671 2 DEBUG nova.compute.manager [req-ed1a5259-5801-4e33-948e-db3963996630 req-823cd198-8d8e-4453-b752-b78ba2f23986 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] No waiting events found dispatching network-vif-unplugged-a12818fb-48f8-4828-9483-8e2f0c930534 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 22:02:31 compute-0 nova_compute[192716]: 2025-10-07 22:02:31.671 2 DEBUG nova.compute.manager [req-ed1a5259-5801-4e33-948e-db3963996630 req-823cd198-8d8e-4453-b752-b78ba2f23986 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Received event network-vif-unplugged-a12818fb-48f8-4828-9483-8e2f0c930534 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 07 22:02:31 compute-0 nova_compute[192716]: 2025-10-07 22:02:31.671 2 DEBUG nova.compute.manager [req-ed1a5259-5801-4e33-948e-db3963996630 req-823cd198-8d8e-4453-b752-b78ba2f23986 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Received event network-vif-plugged-a12818fb-48f8-4828-9483-8e2f0c930534 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:02:31 compute-0 nova_compute[192716]: 2025-10-07 22:02:31.672 2 DEBUG oslo_concurrency.lockutils [req-ed1a5259-5801-4e33-948e-db3963996630 req-823cd198-8d8e-4453-b752-b78ba2f23986 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "05a008ac-6976-48f0-8fc6-2795863bdf61-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:02:31 compute-0 nova_compute[192716]: 2025-10-07 22:02:31.672 2 DEBUG oslo_concurrency.lockutils [req-ed1a5259-5801-4e33-948e-db3963996630 req-823cd198-8d8e-4453-b752-b78ba2f23986 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "05a008ac-6976-48f0-8fc6-2795863bdf61-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:02:31 compute-0 nova_compute[192716]: 2025-10-07 22:02:31.672 2 DEBUG oslo_concurrency.lockutils [req-ed1a5259-5801-4e33-948e-db3963996630 req-823cd198-8d8e-4453-b752-b78ba2f23986 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "05a008ac-6976-48f0-8fc6-2795863bdf61-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:02:31 compute-0 nova_compute[192716]: 2025-10-07 22:02:31.673 2 DEBUG nova.compute.manager [req-ed1a5259-5801-4e33-948e-db3963996630 req-823cd198-8d8e-4453-b752-b78ba2f23986 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] No waiting events found dispatching network-vif-plugged-a12818fb-48f8-4828-9483-8e2f0c930534 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 22:02:31 compute-0 nova_compute[192716]: 2025-10-07 22:02:31.673 2 WARNING nova.compute.manager [req-ed1a5259-5801-4e33-948e-db3963996630 req-823cd198-8d8e-4453-b752-b78ba2f23986 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Received unexpected event network-vif-plugged-a12818fb-48f8-4828-9483-8e2f0c930534 for instance with vm_state active and task_state deleting.
Oct 07 22:02:31 compute-0 nova_compute[192716]: 2025-10-07 22:02:31.673 2 DEBUG nova.compute.manager [req-ed1a5259-5801-4e33-948e-db3963996630 req-823cd198-8d8e-4453-b752-b78ba2f23986 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Received event network-vif-plugged-a12818fb-48f8-4828-9483-8e2f0c930534 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:02:31 compute-0 nova_compute[192716]: 2025-10-07 22:02:31.674 2 DEBUG oslo_concurrency.lockutils [req-ed1a5259-5801-4e33-948e-db3963996630 req-823cd198-8d8e-4453-b752-b78ba2f23986 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "05a008ac-6976-48f0-8fc6-2795863bdf61-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:02:31 compute-0 nova_compute[192716]: 2025-10-07 22:02:31.674 2 DEBUG oslo_concurrency.lockutils [req-ed1a5259-5801-4e33-948e-db3963996630 req-823cd198-8d8e-4453-b752-b78ba2f23986 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "05a008ac-6976-48f0-8fc6-2795863bdf61-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:02:31 compute-0 nova_compute[192716]: 2025-10-07 22:02:31.674 2 DEBUG oslo_concurrency.lockutils [req-ed1a5259-5801-4e33-948e-db3963996630 req-823cd198-8d8e-4453-b752-b78ba2f23986 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "05a008ac-6976-48f0-8fc6-2795863bdf61-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:02:31 compute-0 nova_compute[192716]: 2025-10-07 22:02:31.675 2 DEBUG nova.compute.manager [req-ed1a5259-5801-4e33-948e-db3963996630 req-823cd198-8d8e-4453-b752-b78ba2f23986 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] No waiting events found dispatching network-vif-plugged-a12818fb-48f8-4828-9483-8e2f0c930534 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 22:02:31 compute-0 nova_compute[192716]: 2025-10-07 22:02:31.675 2 WARNING nova.compute.manager [req-ed1a5259-5801-4e33-948e-db3963996630 req-823cd198-8d8e-4453-b752-b78ba2f23986 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Received unexpected event network-vif-plugged-a12818fb-48f8-4828-9483-8e2f0c930534 for instance with vm_state active and task_state deleting.
Oct 07 22:02:31 compute-0 nova_compute[192716]: 2025-10-07 22:02:31.675 2 DEBUG nova.compute.manager [req-ed1a5259-5801-4e33-948e-db3963996630 req-823cd198-8d8e-4453-b752-b78ba2f23986 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Received event network-vif-unplugged-a12818fb-48f8-4828-9483-8e2f0c930534 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:02:31 compute-0 nova_compute[192716]: 2025-10-07 22:02:31.676 2 DEBUG oslo_concurrency.lockutils [req-ed1a5259-5801-4e33-948e-db3963996630 req-823cd198-8d8e-4453-b752-b78ba2f23986 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "05a008ac-6976-48f0-8fc6-2795863bdf61-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:02:31 compute-0 nova_compute[192716]: 2025-10-07 22:02:31.676 2 DEBUG oslo_concurrency.lockutils [req-ed1a5259-5801-4e33-948e-db3963996630 req-823cd198-8d8e-4453-b752-b78ba2f23986 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "05a008ac-6976-48f0-8fc6-2795863bdf61-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:02:31 compute-0 nova_compute[192716]: 2025-10-07 22:02:31.677 2 DEBUG oslo_concurrency.lockutils [req-ed1a5259-5801-4e33-948e-db3963996630 req-823cd198-8d8e-4453-b752-b78ba2f23986 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "05a008ac-6976-48f0-8fc6-2795863bdf61-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:02:31 compute-0 nova_compute[192716]: 2025-10-07 22:02:31.677 2 DEBUG nova.compute.manager [req-ed1a5259-5801-4e33-948e-db3963996630 req-823cd198-8d8e-4453-b752-b78ba2f23986 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] No waiting events found dispatching network-vif-unplugged-a12818fb-48f8-4828-9483-8e2f0c930534 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 22:02:31 compute-0 nova_compute[192716]: 2025-10-07 22:02:31.677 2 DEBUG nova.compute.manager [req-ed1a5259-5801-4e33-948e-db3963996630 req-823cd198-8d8e-4453-b752-b78ba2f23986 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Received event network-vif-unplugged-a12818fb-48f8-4828-9483-8e2f0c930534 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 07 22:02:31 compute-0 nova_compute[192716]: 2025-10-07 22:02:31.678 2 DEBUG nova.compute.manager [req-ed1a5259-5801-4e33-948e-db3963996630 req-823cd198-8d8e-4453-b752-b78ba2f23986 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Received event network-vif-unplugged-a12818fb-48f8-4828-9483-8e2f0c930534 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:02:31 compute-0 nova_compute[192716]: 2025-10-07 22:02:31.678 2 DEBUG oslo_concurrency.lockutils [req-ed1a5259-5801-4e33-948e-db3963996630 req-823cd198-8d8e-4453-b752-b78ba2f23986 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "05a008ac-6976-48f0-8fc6-2795863bdf61-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:02:31 compute-0 nova_compute[192716]: 2025-10-07 22:02:31.679 2 DEBUG oslo_concurrency.lockutils [req-ed1a5259-5801-4e33-948e-db3963996630 req-823cd198-8d8e-4453-b752-b78ba2f23986 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "05a008ac-6976-48f0-8fc6-2795863bdf61-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:02:31 compute-0 nova_compute[192716]: 2025-10-07 22:02:31.679 2 DEBUG oslo_concurrency.lockutils [req-ed1a5259-5801-4e33-948e-db3963996630 req-823cd198-8d8e-4453-b752-b78ba2f23986 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "05a008ac-6976-48f0-8fc6-2795863bdf61-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:02:31 compute-0 nova_compute[192716]: 2025-10-07 22:02:31.680 2 DEBUG nova.compute.manager [req-ed1a5259-5801-4e33-948e-db3963996630 req-823cd198-8d8e-4453-b752-b78ba2f23986 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] No waiting events found dispatching network-vif-unplugged-a12818fb-48f8-4828-9483-8e2f0c930534 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 22:02:31 compute-0 nova_compute[192716]: 2025-10-07 22:02:31.680 2 DEBUG nova.compute.manager [req-ed1a5259-5801-4e33-948e-db3963996630 req-823cd198-8d8e-4453-b752-b78ba2f23986 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Received event network-vif-unplugged-a12818fb-48f8-4828-9483-8e2f0c930534 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 07 22:02:31 compute-0 nova_compute[192716]: 2025-10-07 22:02:31.710 2 DEBUG nova.network.neutron [-] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:02:31 compute-0 nova_compute[192716]: 2025-10-07 22:02:31.797 2 DEBUG nova.compute.manager [req-27ccb818-491f-4fa6-a0a2-738c4adc335d req-80859513-c4b4-4ed6-915d-05efc30f9fb2 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Detach interface failed, port_id=a12818fb-48f8-4828-9483-8e2f0c930534, reason: Instance 05a008ac-6976-48f0-8fc6-2795863bdf61 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 07 22:02:31 compute-0 sshd-session[220236]: Failed password for invalid user sysadmin from 103.115.24.11 port 51960 ssh2
Oct 07 22:02:32 compute-0 nova_compute[192716]: 2025-10-07 22:02:32.216 2 INFO nova.compute.manager [-] [instance: 05a008ac-6976-48f0-8fc6-2795863bdf61] Took 1.54 seconds to deallocate network for instance.
Oct 07 22:02:32 compute-0 nova_compute[192716]: 2025-10-07 22:02:32.744 2 DEBUG oslo_concurrency.lockutils [None req-2bab8197-648d-4619-ae14-b388460bdc5d db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:02:32 compute-0 nova_compute[192716]: 2025-10-07 22:02:32.745 2 DEBUG oslo_concurrency.lockutils [None req-2bab8197-648d-4619-ae14-b388460bdc5d db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:02:32 compute-0 nova_compute[192716]: 2025-10-07 22:02:32.864 2 DEBUG nova.compute.provider_tree [None req-2bab8197-648d-4619-ae14-b388460bdc5d db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 22:02:33 compute-0 nova_compute[192716]: 2025-10-07 22:02:33.372 2 DEBUG nova.scheduler.client.report [None req-2bab8197-648d-4619-ae14-b388460bdc5d db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 22:02:33 compute-0 nova_compute[192716]: 2025-10-07 22:02:33.881 2 DEBUG oslo_concurrency.lockutils [None req-2bab8197-648d-4619-ae14-b388460bdc5d db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.136s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:02:33 compute-0 nova_compute[192716]: 2025-10-07 22:02:33.901 2 INFO nova.scheduler.client.report [None req-2bab8197-648d-4619-ae14-b388460bdc5d db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Deleted allocations for instance 05a008ac-6976-48f0-8fc6-2795863bdf61
Oct 07 22:02:34 compute-0 sshd-session[220236]: Received disconnect from 103.115.24.11 port 51960:11: Bye Bye [preauth]
Oct 07 22:02:34 compute-0 sshd-session[220236]: Disconnected from invalid user sysadmin 103.115.24.11 port 51960 [preauth]
Oct 07 22:02:34 compute-0 nova_compute[192716]: 2025-10-07 22:02:34.711 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:34 compute-0 nova_compute[192716]: 2025-10-07 22:02:34.933 2 DEBUG oslo_concurrency.lockutils [None req-2bab8197-648d-4619-ae14-b388460bdc5d db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Lock "05a008ac-6976-48f0-8fc6-2795863bdf61" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.124s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:02:34 compute-0 nova_compute[192716]: 2025-10-07 22:02:34.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:02:35 compute-0 nova_compute[192716]: 2025-10-07 22:02:35.155 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:35 compute-0 nova_compute[192716]: 2025-10-07 22:02:35.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:02:35 compute-0 nova_compute[192716]: 2025-10-07 22:02:35.991 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:02:36 compute-0 nova_compute[192716]: 2025-10-07 22:02:36.392 2 DEBUG oslo_concurrency.lockutils [None req-cf455b28-3040-4b6a-997b-6a63df9a0191 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Acquiring lock "fd9b0d7e-e882-4574-9e62-a1d142bb6a16" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:02:36 compute-0 nova_compute[192716]: 2025-10-07 22:02:36.393 2 DEBUG oslo_concurrency.lockutils [None req-cf455b28-3040-4b6a-997b-6a63df9a0191 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Lock "fd9b0d7e-e882-4574-9e62-a1d142bb6a16" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:02:36 compute-0 nova_compute[192716]: 2025-10-07 22:02:36.393 2 DEBUG oslo_concurrency.lockutils [None req-cf455b28-3040-4b6a-997b-6a63df9a0191 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Acquiring lock "fd9b0d7e-e882-4574-9e62-a1d142bb6a16-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:02:36 compute-0 nova_compute[192716]: 2025-10-07 22:02:36.394 2 DEBUG oslo_concurrency.lockutils [None req-cf455b28-3040-4b6a-997b-6a63df9a0191 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Lock "fd9b0d7e-e882-4574-9e62-a1d142bb6a16-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:02:36 compute-0 nova_compute[192716]: 2025-10-07 22:02:36.394 2 DEBUG oslo_concurrency.lockutils [None req-cf455b28-3040-4b6a-997b-6a63df9a0191 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Lock "fd9b0d7e-e882-4574-9e62-a1d142bb6a16-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:02:36 compute-0 nova_compute[192716]: 2025-10-07 22:02:36.411 2 INFO nova.compute.manager [None req-cf455b28-3040-4b6a-997b-6a63df9a0191 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: fd9b0d7e-e882-4574-9e62-a1d142bb6a16] Terminating instance
Oct 07 22:02:36 compute-0 nova_compute[192716]: 2025-10-07 22:02:36.505 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:02:36 compute-0 nova_compute[192716]: 2025-10-07 22:02:36.506 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:02:36 compute-0 nova_compute[192716]: 2025-10-07 22:02:36.506 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:02:36 compute-0 nova_compute[192716]: 2025-10-07 22:02:36.506 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 07 22:02:36 compute-0 podman[220312]: 2025-10-07 22:02:36.683665376 +0000 UTC m=+0.125482201 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251007, tcib_build_tag=watcher_latest, container_name=ovn_controller, io.buildah.version=1.41.4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 07 22:02:36 compute-0 nova_compute[192716]: 2025-10-07 22:02:36.934 2 DEBUG nova.compute.manager [None req-cf455b28-3040-4b6a-997b-6a63df9a0191 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: fd9b0d7e-e882-4574-9e62-a1d142bb6a16] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 07 22:02:36 compute-0 kernel: tapc4e362fa-a4 (unregistering): left promiscuous mode
Oct 07 22:02:36 compute-0 NetworkManager[51722]: <info>  [1759874556.9650] device (tapc4e362fa-a4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 07 22:02:36 compute-0 ovn_controller[94904]: 2025-10-07T22:02:36Z|00132|binding|INFO|Releasing lport c4e362fa-a4d3-43d8-80c2-29fa599f7f3f from this chassis (sb_readonly=0)
Oct 07 22:02:36 compute-0 ovn_controller[94904]: 2025-10-07T22:02:36Z|00133|binding|INFO|Setting lport c4e362fa-a4d3-43d8-80c2-29fa599f7f3f down in Southbound
Oct 07 22:02:36 compute-0 nova_compute[192716]: 2025-10-07 22:02:36.973 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:36 compute-0 ovn_controller[94904]: 2025-10-07T22:02:36Z|00134|binding|INFO|Removing iface tapc4e362fa-a4 ovn-installed in OVS
Oct 07 22:02:36 compute-0 nova_compute[192716]: 2025-10-07 22:02:36.975 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:36 compute-0 nova_compute[192716]: 2025-10-07 22:02:36.976 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:36 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:36.984 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4d:8e:a6 10.100.0.11'], port_security=['fa:16:3e:4d:8e:a6 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'fd9b0d7e-e882-4574-9e62-a1d142bb6a16', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-726154fe-bda6-431d-b983-7caa973a9e17', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ad27d63f39845acba6b21828806b82a', 'neutron:revision_number': '15', 'neutron:security_group_ids': '7085a98e-3cea-46c4-a04e-730e3c566bcd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4057f821-2b26-4a21-8644-5757b0f352fc, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], logical_port=c4e362fa-a4d3-43d8-80c2-29fa599f7f3f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f071072e510>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:02:36 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:36.985 103791 INFO neutron.agent.ovn.metadata.agent [-] Port c4e362fa-a4d3-43d8-80c2-29fa599f7f3f in datapath 726154fe-bda6-431d-b983-7caa973a9e17 unbound from our chassis
Oct 07 22:02:36 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:36.986 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 726154fe-bda6-431d-b983-7caa973a9e17, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 07 22:02:36 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:36.988 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[09b1cd37-7d12-4b23-909f-a6351424d8f3]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:02:36 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:36.988 103791 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17 namespace which is not needed anymore
Oct 07 22:02:37 compute-0 nova_compute[192716]: 2025-10-07 22:02:37.000 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:37 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Oct 07 22:02:37 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000c.scope: Consumed 2.420s CPU time.
Oct 07 22:02:37 compute-0 systemd-machined[152719]: Machine qemu-9-instance-0000000c terminated.
Oct 07 22:02:37 compute-0 neutron-haproxy-ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17[219941]: [NOTICE]   (219954) : haproxy version is 3.0.5-8e879a5
Oct 07 22:02:37 compute-0 neutron-haproxy-ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17[219941]: [NOTICE]   (219954) : path to executable is /usr/sbin/haproxy
Oct 07 22:02:37 compute-0 neutron-haproxy-ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17[219941]: [WARNING]  (219954) : Exiting Master process...
Oct 07 22:02:37 compute-0 podman[220364]: 2025-10-07 22:02:37.15705237 +0000 UTC m=+0.040649557 container kill 5d54d4672dbef2af578a659a60cf159e216d33924363c051feb22fc347742baf (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct 07 22:02:37 compute-0 neutron-haproxy-ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17[219941]: [ALERT]    (219954) : Current worker (219956) exited with code 143 (Terminated)
Oct 07 22:02:37 compute-0 neutron-haproxy-ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17[219941]: [WARNING]  (219954) : All workers exited. Exiting... (0)
Oct 07 22:02:37 compute-0 nova_compute[192716]: 2025-10-07 22:02:37.160 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:37 compute-0 systemd[1]: libpod-5d54d4672dbef2af578a659a60cf159e216d33924363c051feb22fc347742baf.scope: Deactivated successfully.
Oct 07 22:02:37 compute-0 nova_compute[192716]: 2025-10-07 22:02:37.166 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:37 compute-0 nova_compute[192716]: 2025-10-07 22:02:37.204 2 INFO nova.virt.libvirt.driver [-] [instance: fd9b0d7e-e882-4574-9e62-a1d142bb6a16] Instance destroyed successfully.
Oct 07 22:02:37 compute-0 nova_compute[192716]: 2025-10-07 22:02:37.206 2 DEBUG nova.objects.instance [None req-cf455b28-3040-4b6a-997b-6a63df9a0191 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Lazy-loading 'resources' on Instance uuid fd9b0d7e-e882-4574-9e62-a1d142bb6a16 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 22:02:37 compute-0 podman[220384]: 2025-10-07 22:02:37.227318063 +0000 UTC m=+0.042951473 container died 5d54d4672dbef2af578a659a60cf159e216d33924363c051feb22fc347742baf (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Oct 07 22:02:37 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5d54d4672dbef2af578a659a60cf159e216d33924363c051feb22fc347742baf-userdata-shm.mount: Deactivated successfully.
Oct 07 22:02:37 compute-0 systemd[1]: var-lib-containers-storage-overlay-6df0ed26343527536356c1ce148cf585a8e4f7d49f6a214bfdcdd8d8a79aa858-merged.mount: Deactivated successfully.
Oct 07 22:02:37 compute-0 podman[220384]: 2025-10-07 22:02:37.27489669 +0000 UTC m=+0.090530000 container cleanup 5d54d4672dbef2af578a659a60cf159e216d33924363c051feb22fc347742baf (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Oct 07 22:02:37 compute-0 systemd[1]: libpod-conmon-5d54d4672dbef2af578a659a60cf159e216d33924363c051feb22fc347742baf.scope: Deactivated successfully.
Oct 07 22:02:37 compute-0 podman[220394]: 2025-10-07 22:02:37.291216432 +0000 UTC m=+0.098597724 container remove 5d54d4672dbef2af578a659a60cf159e216d33924363c051feb22fc347742baf (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct 07 22:02:37 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:37.297 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[4de357f7-2ab3-4d9e-a91a-d0ff6b896116]: (4, ("Tue Oct  7 10:02:37 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17 (5d54d4672dbef2af578a659a60cf159e216d33924363c051feb22fc347742baf)\n5d54d4672dbef2af578a659a60cf159e216d33924363c051feb22fc347742baf\nTue Oct  7 10:02:37 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17 (5d54d4672dbef2af578a659a60cf159e216d33924363c051feb22fc347742baf)\n5d54d4672dbef2af578a659a60cf159e216d33924363c051feb22fc347742baf\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:02:37 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:37.298 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[fdb68924-34e6-4016-800e-d97689b43b72]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:02:37 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:37.298 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/726154fe-bda6-431d-b983-7caa973a9e17.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/726154fe-bda6-431d-b983-7caa973a9e17.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 22:02:37 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:37.299 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[75f92635-fbdf-4391-8fbe-e5bc9e8715c9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:02:37 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:37.299 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap726154fe-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:02:37 compute-0 nova_compute[192716]: 2025-10-07 22:02:37.301 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:37 compute-0 kernel: tap726154fe-b0: left promiscuous mode
Oct 07 22:02:37 compute-0 nova_compute[192716]: 2025-10-07 22:02:37.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:37 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:37.315 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[ba1f726e-0bd1-4814-89ca-18ff3f55fdd1]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:02:37 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:37.341 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[29270c60-ce3c-42d9-9e2e-2b919bc7a59c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:02:37 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:37.341 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[998d11bc-552b-4b14-bc38-ad968ca5fb0b]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:02:37 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:37.354 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[8adb9567-15e2-4181-8569-ad37242c0fb3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 423284, 'reachable_time': 17863, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220430, 'error': None, 'target': 'ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:02:37 compute-0 systemd[1]: run-netns-ovnmeta\x2d726154fe\x2dbda6\x2d431d\x2db983\x2d7caa973a9e17.mount: Deactivated successfully.
Oct 07 22:02:37 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:37.357 103905 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Oct 07 22:02:37 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:02:37.357 103905 DEBUG oslo.privsep.daemon [-] privsep: reply[9c48d61c-3e4a-4026-963c-2cf109533536]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:02:37 compute-0 nova_compute[192716]: 2025-10-07 22:02:37.560 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fd9b0d7e-e882-4574-9e62-a1d142bb6a16/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:02:37 compute-0 nova_compute[192716]: 2025-10-07 22:02:37.656 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fd9b0d7e-e882-4574-9e62-a1d142bb6a16/disk --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:02:37 compute-0 nova_compute[192716]: 2025-10-07 22:02:37.658 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fd9b0d7e-e882-4574-9e62-a1d142bb6a16/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:02:37 compute-0 nova_compute[192716]: 2025-10-07 22:02:37.713 2 DEBUG nova.virt.libvirt.vif [None req-cf455b28-3040-4b6a-997b-6a63df9a0191 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-10-07T22:01:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1032540532',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1032540532',id=12,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-07T22:01:19Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3ad27d63f39845acba6b21828806b82a',ramdisk_id='',reservation_id='r-7m50so7e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',clean_attempts='1',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mod
el='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-152687663',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-152687663-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T22:02:24Z,user_data=None,user_id='db99335261504aa7b84c7d30ec17d679',uuid=fd9b0d7e-e882-4574-9e62-a1d142bb6a16,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c4e362fa-a4d3-43d8-80c2-29fa599f7f3f", "address": "fa:16:3e:4d:8e:a6", "network": {"id": "726154fe-bda6-431d-b983-7caa973a9e17", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-468758744-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "df550d234d364e7fb20f9ac88392be8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4e362fa-a4", "ovs_interfaceid": "c4e362fa-a4d3-43d8-80c2-29fa599f7f3f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 07 22:02:37 compute-0 nova_compute[192716]: 2025-10-07 22:02:37.715 2 DEBUG nova.network.os_vif_util [None req-cf455b28-3040-4b6a-997b-6a63df9a0191 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Converting VIF {"id": "c4e362fa-a4d3-43d8-80c2-29fa599f7f3f", "address": "fa:16:3e:4d:8e:a6", "network": {"id": "726154fe-bda6-431d-b983-7caa973a9e17", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-468758744-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "df550d234d364e7fb20f9ac88392be8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4e362fa-a4", "ovs_interfaceid": "c4e362fa-a4d3-43d8-80c2-29fa599f7f3f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 22:02:37 compute-0 nova_compute[192716]: 2025-10-07 22:02:37.717 2 DEBUG nova.network.os_vif_util [None req-cf455b28-3040-4b6a-997b-6a63df9a0191 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:4d:8e:a6,bridge_name='br-int',has_traffic_filtering=True,id=c4e362fa-a4d3-43d8-80c2-29fa599f7f3f,network=Network(726154fe-bda6-431d-b983-7caa973a9e17),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc4e362fa-a4') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 22:02:37 compute-0 nova_compute[192716]: 2025-10-07 22:02:37.718 2 DEBUG os_vif [None req-cf455b28-3040-4b6a-997b-6a63df9a0191 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:4d:8e:a6,bridge_name='br-int',has_traffic_filtering=True,id=c4e362fa-a4d3-43d8-80c2-29fa599f7f3f,network=Network(726154fe-bda6-431d-b983-7caa973a9e17),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc4e362fa-a4') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 07 22:02:37 compute-0 nova_compute[192716]: 2025-10-07 22:02:37.722 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:37 compute-0 nova_compute[192716]: 2025-10-07 22:02:37.723 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc4e362fa-a4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:02:37 compute-0 nova_compute[192716]: 2025-10-07 22:02:37.724 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:37 compute-0 nova_compute[192716]: 2025-10-07 22:02:37.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:37 compute-0 nova_compute[192716]: 2025-10-07 22:02:37.727 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:37 compute-0 nova_compute[192716]: 2025-10-07 22:02:37.727 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=b089b8e9-fae4-43a2-be94-56921a953ac5) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:02:37 compute-0 nova_compute[192716]: 2025-10-07 22:02:37.728 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:37 compute-0 nova_compute[192716]: 2025-10-07 22:02:37.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:37 compute-0 nova_compute[192716]: 2025-10-07 22:02:37.731 2 INFO os_vif [None req-cf455b28-3040-4b6a-997b-6a63df9a0191 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:4d:8e:a6,bridge_name='br-int',has_traffic_filtering=True,id=c4e362fa-a4d3-43d8-80c2-29fa599f7f3f,network=Network(726154fe-bda6-431d-b983-7caa973a9e17),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc4e362fa-a4')
Oct 07 22:02:37 compute-0 nova_compute[192716]: 2025-10-07 22:02:37.731 2 INFO nova.virt.libvirt.driver [None req-cf455b28-3040-4b6a-997b-6a63df9a0191 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: fd9b0d7e-e882-4574-9e62-a1d142bb6a16] Deleting instance files /var/lib/nova/instances/fd9b0d7e-e882-4574-9e62-a1d142bb6a16_del
Oct 07 22:02:37 compute-0 nova_compute[192716]: 2025-10-07 22:02:37.732 2 INFO nova.virt.libvirt.driver [None req-cf455b28-3040-4b6a-997b-6a63df9a0191 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: fd9b0d7e-e882-4574-9e62-a1d142bb6a16] Deletion of /var/lib/nova/instances/fd9b0d7e-e882-4574-9e62-a1d142bb6a16_del complete
Oct 07 22:02:37 compute-0 nova_compute[192716]: 2025-10-07 22:02:37.751 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fd9b0d7e-e882-4574-9e62-a1d142bb6a16/disk --force-share --output=json" returned: 1 in 0.093s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:02:37 compute-0 nova_compute[192716]: 2025-10-07 22:02:37.751 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] '/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fd9b0d7e-e882-4574-9e62-a1d142bb6a16/disk --force-share --output=json' failed. Not Retrying. execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:423
Oct 07 22:02:37 compute-0 nova_compute[192716]: 2025-10-07 22:02:37.752 2 WARNING nova.virt.libvirt.driver [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Periodic task is updating the host stats, it is trying to get disk info for instance-0000000c, but the backing disk storage was removed by a concurrent operation such as resize. Error: No disk at /var/lib/nova/instances/fd9b0d7e-e882-4574-9e62-a1d142bb6a16/disk: nova.exception.DiskNotFound: No disk at /var/lib/nova/instances/fd9b0d7e-e882-4574-9e62-a1d142bb6a16/disk
Oct 07 22:02:37 compute-0 nova_compute[192716]: 2025-10-07 22:02:37.890 2 WARNING nova.virt.libvirt.driver [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 22:02:37 compute-0 nova_compute[192716]: 2025-10-07 22:02:37.891 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:02:37 compute-0 nova_compute[192716]: 2025-10-07 22:02:37.907 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.015s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:02:37 compute-0 nova_compute[192716]: 2025-10-07 22:02:37.907 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5690MB free_disk=73.27510833740234GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 07 22:02:37 compute-0 nova_compute[192716]: 2025-10-07 22:02:37.907 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:02:37 compute-0 nova_compute[192716]: 2025-10-07 22:02:37.908 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:02:37 compute-0 nova_compute[192716]: 2025-10-07 22:02:37.948 2 DEBUG nova.compute.manager [req-22758c58-3da2-4d2d-9758-6b0cbc4a8db5 req-5910fc77-f49b-481f-a4eb-0eb98095bb48 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: fd9b0d7e-e882-4574-9e62-a1d142bb6a16] Received event network-vif-unplugged-c4e362fa-a4d3-43d8-80c2-29fa599f7f3f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:02:37 compute-0 nova_compute[192716]: 2025-10-07 22:02:37.948 2 DEBUG oslo_concurrency.lockutils [req-22758c58-3da2-4d2d-9758-6b0cbc4a8db5 req-5910fc77-f49b-481f-a4eb-0eb98095bb48 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "fd9b0d7e-e882-4574-9e62-a1d142bb6a16-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:02:37 compute-0 nova_compute[192716]: 2025-10-07 22:02:37.949 2 DEBUG oslo_concurrency.lockutils [req-22758c58-3da2-4d2d-9758-6b0cbc4a8db5 req-5910fc77-f49b-481f-a4eb-0eb98095bb48 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "fd9b0d7e-e882-4574-9e62-a1d142bb6a16-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:02:37 compute-0 nova_compute[192716]: 2025-10-07 22:02:37.949 2 DEBUG oslo_concurrency.lockutils [req-22758c58-3da2-4d2d-9758-6b0cbc4a8db5 req-5910fc77-f49b-481f-a4eb-0eb98095bb48 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "fd9b0d7e-e882-4574-9e62-a1d142bb6a16-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:02:37 compute-0 nova_compute[192716]: 2025-10-07 22:02:37.949 2 DEBUG nova.compute.manager [req-22758c58-3da2-4d2d-9758-6b0cbc4a8db5 req-5910fc77-f49b-481f-a4eb-0eb98095bb48 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: fd9b0d7e-e882-4574-9e62-a1d142bb6a16] No waiting events found dispatching network-vif-unplugged-c4e362fa-a4d3-43d8-80c2-29fa599f7f3f pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 22:02:37 compute-0 nova_compute[192716]: 2025-10-07 22:02:37.950 2 DEBUG nova.compute.manager [req-22758c58-3da2-4d2d-9758-6b0cbc4a8db5 req-5910fc77-f49b-481f-a4eb-0eb98095bb48 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: fd9b0d7e-e882-4574-9e62-a1d142bb6a16] Received event network-vif-unplugged-c4e362fa-a4d3-43d8-80c2-29fa599f7f3f for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 07 22:02:38 compute-0 nova_compute[192716]: 2025-10-07 22:02:38.245 2 INFO nova.compute.manager [None req-cf455b28-3040-4b6a-997b-6a63df9a0191 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: fd9b0d7e-e882-4574-9e62-a1d142bb6a16] Took 1.31 seconds to destroy the instance on the hypervisor.
Oct 07 22:02:38 compute-0 nova_compute[192716]: 2025-10-07 22:02:38.246 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-cf455b28-3040-4b6a-997b-6a63df9a0191 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 07 22:02:38 compute-0 nova_compute[192716]: 2025-10-07 22:02:38.247 2 DEBUG nova.compute.manager [-] [instance: fd9b0d7e-e882-4574-9e62-a1d142bb6a16] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 07 22:02:38 compute-0 nova_compute[192716]: 2025-10-07 22:02:38.247 2 DEBUG nova.network.neutron [-] [instance: fd9b0d7e-e882-4574-9e62-a1d142bb6a16] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 07 22:02:38 compute-0 nova_compute[192716]: 2025-10-07 22:02:38.248 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:02:38 compute-0 nova_compute[192716]: 2025-10-07 22:02:38.772 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:02:38 compute-0 podman[220437]: 2025-10-07 22:02:38.884405482 +0000 UTC m=+0.116115300 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251007, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Oct 07 22:02:39 compute-0 nova_compute[192716]: 2025-10-07 22:02:39.463 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Instance fd9b0d7e-e882-4574-9e62-a1d142bb6a16 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 07 22:02:39 compute-0 nova_compute[192716]: 2025-10-07 22:02:39.463 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 07 22:02:39 compute-0 nova_compute[192716]: 2025-10-07 22:02:39.463 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:02:37 up  1:11,  0 user,  load average: 0.30, 0.21, 0.30\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_deleting': '1', 'num_os_type_None': '1', 'num_proj_3ad27d63f39845acba6b21828806b82a': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 07 22:02:39 compute-0 nova_compute[192716]: 2025-10-07 22:02:39.502 2 DEBUG nova.compute.provider_tree [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 22:02:39 compute-0 nova_compute[192716]: 2025-10-07 22:02:39.714 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:40 compute-0 nova_compute[192716]: 2025-10-07 22:02:40.010 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 22:02:40 compute-0 nova_compute[192716]: 2025-10-07 22:02:40.034 2 DEBUG nova.compute.manager [req-eb7301f2-2fb4-4716-b18d-2aeba024d883 req-6f4842b4-459c-4c26-a167-ed43d9bc18b8 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: fd9b0d7e-e882-4574-9e62-a1d142bb6a16] Received event network-vif-unplugged-c4e362fa-a4d3-43d8-80c2-29fa599f7f3f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:02:40 compute-0 nova_compute[192716]: 2025-10-07 22:02:40.034 2 DEBUG oslo_concurrency.lockutils [req-eb7301f2-2fb4-4716-b18d-2aeba024d883 req-6f4842b4-459c-4c26-a167-ed43d9bc18b8 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "fd9b0d7e-e882-4574-9e62-a1d142bb6a16-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:02:40 compute-0 nova_compute[192716]: 2025-10-07 22:02:40.034 2 DEBUG oslo_concurrency.lockutils [req-eb7301f2-2fb4-4716-b18d-2aeba024d883 req-6f4842b4-459c-4c26-a167-ed43d9bc18b8 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "fd9b0d7e-e882-4574-9e62-a1d142bb6a16-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:02:40 compute-0 nova_compute[192716]: 2025-10-07 22:02:40.034 2 DEBUG oslo_concurrency.lockutils [req-eb7301f2-2fb4-4716-b18d-2aeba024d883 req-6f4842b4-459c-4c26-a167-ed43d9bc18b8 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "fd9b0d7e-e882-4574-9e62-a1d142bb6a16-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:02:40 compute-0 nova_compute[192716]: 2025-10-07 22:02:40.035 2 DEBUG nova.compute.manager [req-eb7301f2-2fb4-4716-b18d-2aeba024d883 req-6f4842b4-459c-4c26-a167-ed43d9bc18b8 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: fd9b0d7e-e882-4574-9e62-a1d142bb6a16] No waiting events found dispatching network-vif-unplugged-c4e362fa-a4d3-43d8-80c2-29fa599f7f3f pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 22:02:40 compute-0 nova_compute[192716]: 2025-10-07 22:02:40.035 2 DEBUG nova.compute.manager [req-eb7301f2-2fb4-4716-b18d-2aeba024d883 req-6f4842b4-459c-4c26-a167-ed43d9bc18b8 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: fd9b0d7e-e882-4574-9e62-a1d142bb6a16] Received event network-vif-unplugged-c4e362fa-a4d3-43d8-80c2-29fa599f7f3f for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 07 22:02:40 compute-0 nova_compute[192716]: 2025-10-07 22:02:40.116 2 DEBUG nova.compute.manager [req-5ffa1d8f-18ec-4604-bb05-4cdc976c4267 req-d51d4777-c65d-4f4a-b9ea-2f23c7b7826f 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: fd9b0d7e-e882-4574-9e62-a1d142bb6a16] Received event network-vif-deleted-c4e362fa-a4d3-43d8-80c2-29fa599f7f3f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:02:40 compute-0 nova_compute[192716]: 2025-10-07 22:02:40.117 2 INFO nova.compute.manager [req-5ffa1d8f-18ec-4604-bb05-4cdc976c4267 req-d51d4777-c65d-4f4a-b9ea-2f23c7b7826f 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: fd9b0d7e-e882-4574-9e62-a1d142bb6a16] Neutron deleted interface c4e362fa-a4d3-43d8-80c2-29fa599f7f3f; detaching it from the instance and deleting it from the info cache
Oct 07 22:02:40 compute-0 nova_compute[192716]: 2025-10-07 22:02:40.117 2 DEBUG nova.network.neutron [req-5ffa1d8f-18ec-4604-bb05-4cdc976c4267 req-d51d4777-c65d-4f4a-b9ea-2f23c7b7826f 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: fd9b0d7e-e882-4574-9e62-a1d142bb6a16] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:02:40 compute-0 nova_compute[192716]: 2025-10-07 22:02:40.519 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 07 22:02:40 compute-0 nova_compute[192716]: 2025-10-07 22:02:40.520 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.612s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:02:40 compute-0 nova_compute[192716]: 2025-10-07 22:02:40.521 2 DEBUG nova.network.neutron [-] [instance: fd9b0d7e-e882-4574-9e62-a1d142bb6a16] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:02:40 compute-0 nova_compute[192716]: 2025-10-07 22:02:40.626 2 DEBUG nova.compute.manager [req-5ffa1d8f-18ec-4604-bb05-4cdc976c4267 req-d51d4777-c65d-4f4a-b9ea-2f23c7b7826f 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: fd9b0d7e-e882-4574-9e62-a1d142bb6a16] Detach interface failed, port_id=c4e362fa-a4d3-43d8-80c2-29fa599f7f3f, reason: Instance fd9b0d7e-e882-4574-9e62-a1d142bb6a16 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 07 22:02:41 compute-0 nova_compute[192716]: 2025-10-07 22:02:41.029 2 INFO nova.compute.manager [-] [instance: fd9b0d7e-e882-4574-9e62-a1d142bb6a16] Took 2.78 seconds to deallocate network for instance.
Oct 07 22:02:41 compute-0 nova_compute[192716]: 2025-10-07 22:02:41.524 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:02:41 compute-0 nova_compute[192716]: 2025-10-07 22:02:41.561 2 DEBUG oslo_concurrency.lockutils [None req-cf455b28-3040-4b6a-997b-6a63df9a0191 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:02:41 compute-0 nova_compute[192716]: 2025-10-07 22:02:41.562 2 DEBUG oslo_concurrency.lockutils [None req-cf455b28-3040-4b6a-997b-6a63df9a0191 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:02:41 compute-0 nova_compute[192716]: 2025-10-07 22:02:41.599 2 DEBUG nova.compute.provider_tree [None req-cf455b28-3040-4b6a-997b-6a63df9a0191 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 22:02:41 compute-0 podman[220456]: 2025-10-07 22:02:41.831815979 +0000 UTC m=+0.073833627 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, name=ubi9-minimal, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, config_id=edpm)
Oct 07 22:02:42 compute-0 nova_compute[192716]: 2025-10-07 22:02:42.105 2 DEBUG nova.scheduler.client.report [None req-cf455b28-3040-4b6a-997b-6a63df9a0191 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 22:02:42 compute-0 nova_compute[192716]: 2025-10-07 22:02:42.616 2 DEBUG oslo_concurrency.lockutils [None req-cf455b28-3040-4b6a-997b-6a63df9a0191 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.054s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:02:42 compute-0 nova_compute[192716]: 2025-10-07 22:02:42.642 2 INFO nova.scheduler.client.report [None req-cf455b28-3040-4b6a-997b-6a63df9a0191 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Deleted allocations for instance fd9b0d7e-e882-4574-9e62-a1d142bb6a16
Oct 07 22:02:42 compute-0 nova_compute[192716]: 2025-10-07 22:02:42.728 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:43 compute-0 nova_compute[192716]: 2025-10-07 22:02:43.676 2 DEBUG oslo_concurrency.lockutils [None req-cf455b28-3040-4b6a-997b-6a63df9a0191 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Lock "fd9b0d7e-e882-4574-9e62-a1d142bb6a16" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.283s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:02:44 compute-0 nova_compute[192716]: 2025-10-07 22:02:44.717 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:47 compute-0 nova_compute[192716]: 2025-10-07 22:02:47.779 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:49 compute-0 nova_compute[192716]: 2025-10-07 22:02:49.719 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:52 compute-0 nova_compute[192716]: 2025-10-07 22:02:52.783 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:54 compute-0 nova_compute[192716]: 2025-10-07 22:02:54.721 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:56 compute-0 podman[220479]: 2025-10-07 22:02:56.847305221 +0000 UTC m=+0.075324430 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=iscsid, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=iscsid, managed_by=edpm_ansible)
Oct 07 22:02:56 compute-0 podman[220480]: 2025-10-07 22:02:56.859953467 +0000 UTC m=+0.090936462 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_id=multipathd, io.buildah.version=1.41.4)
Oct 07 22:02:57 compute-0 nova_compute[192716]: 2025-10-07 22:02:57.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:59 compute-0 nova_compute[192716]: 2025-10-07 22:02:59.724 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:02:59 compute-0 podman[203153]: time="2025-10-07T22:02:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:02:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:02:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 22:02:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:02:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3021 "" "Go-http-client/1.1"
Oct 07 22:02:59 compute-0 podman[220516]: 2025-10-07 22:02:59.815964903 +0000 UTC m=+0.056902337 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 07 22:03:01 compute-0 openstack_network_exporter[205305]: ERROR   22:03:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:03:01 compute-0 openstack_network_exporter[205305]: ERROR   22:03:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:03:01 compute-0 openstack_network_exporter[205305]: ERROR   22:03:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:03:01 compute-0 openstack_network_exporter[205305]: ERROR   22:03:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:03:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:03:01 compute-0 openstack_network_exporter[205305]: ERROR   22:03:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:03:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:03:02 compute-0 nova_compute[192716]: 2025-10-07 22:03:02.788 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:03:04 compute-0 nova_compute[192716]: 2025-10-07 22:03:04.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:03:06 compute-0 podman[220542]: 2025-10-07 22:03:06.887177221 +0000 UTC m=+0.122491195 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 07 22:03:07 compute-0 nova_compute[192716]: 2025-10-07 22:03:07.790 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:03:09 compute-0 nova_compute[192716]: 2025-10-07 22:03:09.728 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:03:09 compute-0 podman[220569]: 2025-10-07 22:03:09.826997269 +0000 UTC m=+0.068850203 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251007)
Oct 07 22:03:10 compute-0 nova_compute[192716]: 2025-10-07 22:03:10.254 2 DEBUG oslo_concurrency.lockutils [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Acquiring lock "d237b0bb-037e-4864-9f2b-3cd5343c9b1a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:03:10 compute-0 nova_compute[192716]: 2025-10-07 22:03:10.255 2 DEBUG oslo_concurrency.lockutils [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Lock "d237b0bb-037e-4864-9f2b-3cd5343c9b1a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:03:10 compute-0 nova_compute[192716]: 2025-10-07 22:03:10.763 2 DEBUG nova.compute.manager [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Oct 07 22:03:11 compute-0 nova_compute[192716]: 2025-10-07 22:03:11.322 2 DEBUG oslo_concurrency.lockutils [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:03:11 compute-0 nova_compute[192716]: 2025-10-07 22:03:11.323 2 DEBUG oslo_concurrency.lockutils [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:03:11 compute-0 nova_compute[192716]: 2025-10-07 22:03:11.331 2 DEBUG nova.virt.hardware [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Oct 07 22:03:11 compute-0 nova_compute[192716]: 2025-10-07 22:03:11.331 2 INFO nova.compute.claims [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Claim successful on node compute-0.ctlplane.example.com
Oct 07 22:03:12 compute-0 nova_compute[192716]: 2025-10-07 22:03:12.478 2 DEBUG nova.compute.provider_tree [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 22:03:12 compute-0 nova_compute[192716]: 2025-10-07 22:03:12.792 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:03:12 compute-0 podman[220588]: 2025-10-07 22:03:12.817051369 +0000 UTC m=+0.052849950 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, release=1755695350, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, version=9.6, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, name=ubi9-minimal, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct 07 22:03:12 compute-0 nova_compute[192716]: 2025-10-07 22:03:12.988 2 DEBUG nova.scheduler.client.report [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 22:03:13 compute-0 nova_compute[192716]: 2025-10-07 22:03:13.499 2 DEBUG oslo_concurrency.lockutils [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.176s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:03:13 compute-0 nova_compute[192716]: 2025-10-07 22:03:13.500 2 DEBUG nova.compute.manager [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Oct 07 22:03:14 compute-0 nova_compute[192716]: 2025-10-07 22:03:14.011 2 DEBUG nova.compute.manager [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Oct 07 22:03:14 compute-0 nova_compute[192716]: 2025-10-07 22:03:14.012 2 DEBUG nova.network.neutron [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Oct 07 22:03:14 compute-0 nova_compute[192716]: 2025-10-07 22:03:14.013 2 WARNING neutronclient.v2_0.client [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:03:14 compute-0 nova_compute[192716]: 2025-10-07 22:03:14.013 2 WARNING neutronclient.v2_0.client [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:03:14 compute-0 nova_compute[192716]: 2025-10-07 22:03:14.525 2 INFO nova.virt.libvirt.driver [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 07 22:03:14 compute-0 nova_compute[192716]: 2025-10-07 22:03:14.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:03:15 compute-0 nova_compute[192716]: 2025-10-07 22:03:15.033 2 DEBUG nova.compute.manager [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Oct 07 22:03:15 compute-0 nova_compute[192716]: 2025-10-07 22:03:15.085 2 DEBUG nova.network.neutron [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Successfully created port: 3fa2f6fa-5235-40b1-95d4-a5750a801212 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Oct 07 22:03:16 compute-0 nova_compute[192716]: 2025-10-07 22:03:16.053 2 DEBUG nova.compute.manager [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Oct 07 22:03:16 compute-0 nova_compute[192716]: 2025-10-07 22:03:16.055 2 DEBUG nova.virt.libvirt.driver [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Oct 07 22:03:16 compute-0 nova_compute[192716]: 2025-10-07 22:03:16.056 2 INFO nova.virt.libvirt.driver [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Creating image(s)
Oct 07 22:03:16 compute-0 nova_compute[192716]: 2025-10-07 22:03:16.056 2 DEBUG oslo_concurrency.lockutils [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Acquiring lock "/var/lib/nova/instances/d237b0bb-037e-4864-9f2b-3cd5343c9b1a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:03:16 compute-0 nova_compute[192716]: 2025-10-07 22:03:16.057 2 DEBUG oslo_concurrency.lockutils [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Lock "/var/lib/nova/instances/d237b0bb-037e-4864-9f2b-3cd5343c9b1a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:03:16 compute-0 nova_compute[192716]: 2025-10-07 22:03:16.058 2 DEBUG oslo_concurrency.lockutils [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Lock "/var/lib/nova/instances/d237b0bb-037e-4864-9f2b-3cd5343c9b1a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:03:16 compute-0 nova_compute[192716]: 2025-10-07 22:03:16.059 2 DEBUG oslo_utils.imageutils.format_inspector [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:03:16 compute-0 nova_compute[192716]: 2025-10-07 22:03:16.066 2 DEBUG oslo_utils.imageutils.format_inspector [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:03:16 compute-0 nova_compute[192716]: 2025-10-07 22:03:16.069 2 DEBUG oslo_concurrency.processutils [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:03:16 compute-0 nova_compute[192716]: 2025-10-07 22:03:16.163 2 DEBUG oslo_concurrency.processutils [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:03:16 compute-0 nova_compute[192716]: 2025-10-07 22:03:16.164 2 DEBUG oslo_concurrency.lockutils [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Acquiring lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:03:16 compute-0 nova_compute[192716]: 2025-10-07 22:03:16.165 2 DEBUG oslo_concurrency.lockutils [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:03:16 compute-0 nova_compute[192716]: 2025-10-07 22:03:16.165 2 DEBUG oslo_utils.imageutils.format_inspector [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:03:16 compute-0 nova_compute[192716]: 2025-10-07 22:03:16.169 2 DEBUG oslo_utils.imageutils.format_inspector [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:03:16 compute-0 nova_compute[192716]: 2025-10-07 22:03:16.169 2 DEBUG oslo_concurrency.processutils [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:03:16 compute-0 nova_compute[192716]: 2025-10-07 22:03:16.239 2 DEBUG oslo_concurrency.processutils [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:03:16 compute-0 nova_compute[192716]: 2025-10-07 22:03:16.241 2 DEBUG oslo_concurrency.processutils [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71,backing_fmt=raw /var/lib/nova/instances/d237b0bb-037e-4864-9f2b-3cd5343c9b1a/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:03:16 compute-0 nova_compute[192716]: 2025-10-07 22:03:16.278 2 DEBUG oslo_concurrency.processutils [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71,backing_fmt=raw /var/lib/nova/instances/d237b0bb-037e-4864-9f2b-3cd5343c9b1a/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:03:16 compute-0 nova_compute[192716]: 2025-10-07 22:03:16.279 2 DEBUG oslo_concurrency.lockutils [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.114s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:03:16 compute-0 nova_compute[192716]: 2025-10-07 22:03:16.279 2 DEBUG oslo_concurrency.processutils [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:03:16 compute-0 nova_compute[192716]: 2025-10-07 22:03:16.347 2 DEBUG oslo_concurrency.processutils [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:03:16 compute-0 nova_compute[192716]: 2025-10-07 22:03:16.348 2 DEBUG nova.virt.disk.api [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Checking if we can resize image /var/lib/nova/instances/d237b0bb-037e-4864-9f2b-3cd5343c9b1a/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 07 22:03:16 compute-0 nova_compute[192716]: 2025-10-07 22:03:16.349 2 DEBUG oslo_concurrency.processutils [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d237b0bb-037e-4864-9f2b-3cd5343c9b1a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:03:16 compute-0 nova_compute[192716]: 2025-10-07 22:03:16.421 2 DEBUG oslo_concurrency.processutils [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d237b0bb-037e-4864-9f2b-3cd5343c9b1a/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:03:16 compute-0 nova_compute[192716]: 2025-10-07 22:03:16.422 2 DEBUG nova.virt.disk.api [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Cannot resize image /var/lib/nova/instances/d237b0bb-037e-4864-9f2b-3cd5343c9b1a/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 07 22:03:16 compute-0 nova_compute[192716]: 2025-10-07 22:03:16.423 2 DEBUG nova.virt.libvirt.driver [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Oct 07 22:03:16 compute-0 nova_compute[192716]: 2025-10-07 22:03:16.423 2 DEBUG nova.virt.libvirt.driver [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Ensure instance console log exists: /var/lib/nova/instances/d237b0bb-037e-4864-9f2b-3cd5343c9b1a/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Oct 07 22:03:16 compute-0 nova_compute[192716]: 2025-10-07 22:03:16.424 2 DEBUG oslo_concurrency.lockutils [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:03:16 compute-0 nova_compute[192716]: 2025-10-07 22:03:16.424 2 DEBUG oslo_concurrency.lockutils [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:03:16 compute-0 nova_compute[192716]: 2025-10-07 22:03:16.424 2 DEBUG oslo_concurrency.lockutils [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:03:17 compute-0 nova_compute[192716]: 2025-10-07 22:03:17.794 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:03:17 compute-0 nova_compute[192716]: 2025-10-07 22:03:17.978 2 DEBUG nova.network.neutron [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Successfully updated port: 3fa2f6fa-5235-40b1-95d4-a5750a801212 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Oct 07 22:03:18 compute-0 nova_compute[192716]: 2025-10-07 22:03:18.050 2 DEBUG nova.compute.manager [req-c4f4d90d-4288-4906-a9cb-18b0ea777a0c req-77ee2526-8371-4c23-834c-4ae882f8d83d 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Received event network-changed-3fa2f6fa-5235-40b1-95d4-a5750a801212 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:03:18 compute-0 nova_compute[192716]: 2025-10-07 22:03:18.051 2 DEBUG nova.compute.manager [req-c4f4d90d-4288-4906-a9cb-18b0ea777a0c req-77ee2526-8371-4c23-834c-4ae882f8d83d 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Refreshing instance network info cache due to event network-changed-3fa2f6fa-5235-40b1-95d4-a5750a801212. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Oct 07 22:03:18 compute-0 nova_compute[192716]: 2025-10-07 22:03:18.051 2 DEBUG oslo_concurrency.lockutils [req-c4f4d90d-4288-4906-a9cb-18b0ea777a0c req-77ee2526-8371-4c23-834c-4ae882f8d83d 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "refresh_cache-d237b0bb-037e-4864-9f2b-3cd5343c9b1a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 07 22:03:18 compute-0 nova_compute[192716]: 2025-10-07 22:03:18.051 2 DEBUG oslo_concurrency.lockutils [req-c4f4d90d-4288-4906-a9cb-18b0ea777a0c req-77ee2526-8371-4c23-834c-4ae882f8d83d 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquired lock "refresh_cache-d237b0bb-037e-4864-9f2b-3cd5343c9b1a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 07 22:03:18 compute-0 nova_compute[192716]: 2025-10-07 22:03:18.051 2 DEBUG nova.network.neutron [req-c4f4d90d-4288-4906-a9cb-18b0ea777a0c req-77ee2526-8371-4c23-834c-4ae882f8d83d 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Refreshing network info cache for port 3fa2f6fa-5235-40b1-95d4-a5750a801212 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Oct 07 22:03:18 compute-0 nova_compute[192716]: 2025-10-07 22:03:18.485 2 DEBUG oslo_concurrency.lockutils [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Acquiring lock "refresh_cache-d237b0bb-037e-4864-9f2b-3cd5343c9b1a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 07 22:03:18 compute-0 nova_compute[192716]: 2025-10-07 22:03:18.558 2 WARNING neutronclient.v2_0.client [req-c4f4d90d-4288-4906-a9cb-18b0ea777a0c req-77ee2526-8371-4c23-834c-4ae882f8d83d 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:03:18 compute-0 nova_compute[192716]: 2025-10-07 22:03:18.739 2 DEBUG nova.network.neutron [req-c4f4d90d-4288-4906-a9cb-18b0ea777a0c req-77ee2526-8371-4c23-834c-4ae882f8d83d 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 07 22:03:18 compute-0 nova_compute[192716]: 2025-10-07 22:03:18.967 2 DEBUG nova.network.neutron [req-c4f4d90d-4288-4906-a9cb-18b0ea777a0c req-77ee2526-8371-4c23-834c-4ae882f8d83d 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:03:19 compute-0 nova_compute[192716]: 2025-10-07 22:03:19.474 2 DEBUG oslo_concurrency.lockutils [req-c4f4d90d-4288-4906-a9cb-18b0ea777a0c req-77ee2526-8371-4c23-834c-4ae882f8d83d 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Releasing lock "refresh_cache-d237b0bb-037e-4864-9f2b-3cd5343c9b1a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 07 22:03:19 compute-0 nova_compute[192716]: 2025-10-07 22:03:19.476 2 DEBUG oslo_concurrency.lockutils [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Acquired lock "refresh_cache-d237b0bb-037e-4864-9f2b-3cd5343c9b1a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 07 22:03:19 compute-0 nova_compute[192716]: 2025-10-07 22:03:19.476 2 DEBUG nova.network.neutron [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 07 22:03:19 compute-0 nova_compute[192716]: 2025-10-07 22:03:19.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:03:20 compute-0 nova_compute[192716]: 2025-10-07 22:03:20.331 2 DEBUG nova.network.neutron [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 07 22:03:20 compute-0 nova_compute[192716]: 2025-10-07 22:03:20.583 2 WARNING neutronclient.v2_0.client [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:03:20 compute-0 nova_compute[192716]: 2025-10-07 22:03:20.775 2 DEBUG nova.network.neutron [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Updating instance_info_cache with network_info: [{"id": "3fa2f6fa-5235-40b1-95d4-a5750a801212", "address": "fa:16:3e:7a:7b:28", "network": {"id": "726154fe-bda6-431d-b983-7caa973a9e17", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-468758744-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "df550d234d364e7fb20f9ac88392be8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3fa2f6fa-52", "ovs_interfaceid": "3fa2f6fa-5235-40b1-95d4-a5750a801212", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:03:21 compute-0 nova_compute[192716]: 2025-10-07 22:03:21.282 2 DEBUG oslo_concurrency.lockutils [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Releasing lock "refresh_cache-d237b0bb-037e-4864-9f2b-3cd5343c9b1a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 07 22:03:21 compute-0 nova_compute[192716]: 2025-10-07 22:03:21.283 2 DEBUG nova.compute.manager [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Instance network_info: |[{"id": "3fa2f6fa-5235-40b1-95d4-a5750a801212", "address": "fa:16:3e:7a:7b:28", "network": {"id": "726154fe-bda6-431d-b983-7caa973a9e17", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-468758744-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "df550d234d364e7fb20f9ac88392be8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3fa2f6fa-52", "ovs_interfaceid": "3fa2f6fa-5235-40b1-95d4-a5750a801212", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Oct 07 22:03:21 compute-0 nova_compute[192716]: 2025-10-07 22:03:21.287 2 DEBUG nova.virt.libvirt.driver [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Start _get_guest_xml network_info=[{"id": "3fa2f6fa-5235-40b1-95d4-a5750a801212", "address": "fa:16:3e:7a:7b:28", "network": {"id": "726154fe-bda6-431d-b983-7caa973a9e17", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-468758744-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "df550d234d364e7fb20f9ac88392be8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3fa2f6fa-52", "ovs_interfaceid": "3fa2f6fa-5235-40b1-95d4-a5750a801212", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T21:45:40Z,direct_url=<?>,disk_format='qcow2',id=c40cab67-7e52-4762-b275-de0efa24bdf4,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='293ff4341f3d48a4ae100bf4fc7b99bd',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T21:45:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'boot_index': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'image_id': 'c40cab67-7e52-4762-b275-de0efa24bdf4'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Oct 07 22:03:21 compute-0 nova_compute[192716]: 2025-10-07 22:03:21.293 2 WARNING nova.virt.libvirt.driver [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 22:03:21 compute-0 nova_compute[192716]: 2025-10-07 22:03:21.295 2 DEBUG nova.virt.driver [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='c40cab67-7e52-4762-b275-de0efa24bdf4', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteHostMaintenanceStrategy-server-156515144', uuid='d237b0bb-037e-4864-9f2b-3cd5343c9b1a'), owner=OwnerMeta(userid='db99335261504aa7b84c7d30ec17d679', username='tempest-TestExecuteHostMaintenanceStrategy-152687663-project-admin', projectid='3ad27d63f39845acba6b21828806b82a', projectname='tempest-TestExecuteHostMaintenanceStrategy-152687663'), image=ImageMeta(id='c40cab67-7e52-4762-b275-de0efa24bdf4', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='e6ccb6f6-2ec4-4305-9fdd-4d931e83ed21', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "3fa2f6fa-5235-40b1-95d4-a5750a801212", "address": "fa:16:3e:7a:7b:28", "network": {"id": "726154fe-bda6-431d-b983-7caa973a9e17", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-468758744-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "df550d234d364e7fb20f9ac88392be8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3fa2f6fa-52", "ovs_interfaceid": 
"3fa2f6fa-5235-40b1-95d4-a5750a801212", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251007122402.7278e66.el10', creation_time=1759874601.2956991) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Oct 07 22:03:21 compute-0 nova_compute[192716]: 2025-10-07 22:03:21.300 2 DEBUG nova.virt.libvirt.host [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Oct 07 22:03:21 compute-0 nova_compute[192716]: 2025-10-07 22:03:21.301 2 DEBUG nova.virt.libvirt.host [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Oct 07 22:03:21 compute-0 nova_compute[192716]: 2025-10-07 22:03:21.305 2 DEBUG nova.virt.libvirt.host [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Oct 07 22:03:21 compute-0 nova_compute[192716]: 2025-10-07 22:03:21.306 2 DEBUG nova.virt.libvirt.host [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Oct 07 22:03:21 compute-0 nova_compute[192716]: 2025-10-07 22:03:21.306 2 DEBUG nova.virt.libvirt.driver [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Oct 07 22:03:21 compute-0 nova_compute[192716]: 2025-10-07 22:03:21.307 2 DEBUG nova.virt.hardware [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-07T21:45:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='e6ccb6f6-2ec4-4305-9fdd-4d931e83ed21',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T21:45:40Z,direct_url=<?>,disk_format='qcow2',id=c40cab67-7e52-4762-b275-de0efa24bdf4,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='293ff4341f3d48a4ae100bf4fc7b99bd',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T21:45:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Oct 07 22:03:21 compute-0 nova_compute[192716]: 2025-10-07 22:03:21.308 2 DEBUG nova.virt.hardware [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Oct 07 22:03:21 compute-0 nova_compute[192716]: 2025-10-07 22:03:21.308 2 DEBUG nova.virt.hardware [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Oct 07 22:03:21 compute-0 nova_compute[192716]: 2025-10-07 22:03:21.309 2 DEBUG nova.virt.hardware [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Oct 07 22:03:21 compute-0 nova_compute[192716]: 2025-10-07 22:03:21.309 2 DEBUG nova.virt.hardware [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Oct 07 22:03:21 compute-0 nova_compute[192716]: 2025-10-07 22:03:21.310 2 DEBUG nova.virt.hardware [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Oct 07 22:03:21 compute-0 nova_compute[192716]: 2025-10-07 22:03:21.310 2 DEBUG nova.virt.hardware [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Oct 07 22:03:21 compute-0 nova_compute[192716]: 2025-10-07 22:03:21.310 2 DEBUG nova.virt.hardware [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Oct 07 22:03:21 compute-0 nova_compute[192716]: 2025-10-07 22:03:21.311 2 DEBUG nova.virt.hardware [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Oct 07 22:03:21 compute-0 nova_compute[192716]: 2025-10-07 22:03:21.311 2 DEBUG nova.virt.hardware [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Oct 07 22:03:21 compute-0 nova_compute[192716]: 2025-10-07 22:03:21.312 2 DEBUG nova.virt.hardware [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Oct 07 22:03:21 compute-0 nova_compute[192716]: 2025-10-07 22:03:21.318 2 DEBUG nova.virt.libvirt.vif [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-07T22:03:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-156515144',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-156515144',id=15,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3ad27d63f39845acba6b21828806b82a',ramdisk_id='',reservation_id='r-vaeio20y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-152687663',owner_user_name='tempest-TestExecu
teHostMaintenanceStrategy-152687663-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T22:03:15Z,user_data=None,user_id='db99335261504aa7b84c7d30ec17d679',uuid=d237b0bb-037e-4864-9f2b-3cd5343c9b1a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3fa2f6fa-5235-40b1-95d4-a5750a801212", "address": "fa:16:3e:7a:7b:28", "network": {"id": "726154fe-bda6-431d-b983-7caa973a9e17", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-468758744-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "df550d234d364e7fb20f9ac88392be8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3fa2f6fa-52", "ovs_interfaceid": "3fa2f6fa-5235-40b1-95d4-a5750a801212", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Oct 07 22:03:21 compute-0 nova_compute[192716]: 2025-10-07 22:03:21.319 2 DEBUG nova.network.os_vif_util [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Converting VIF {"id": "3fa2f6fa-5235-40b1-95d4-a5750a801212", "address": "fa:16:3e:7a:7b:28", "network": {"id": "726154fe-bda6-431d-b983-7caa973a9e17", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-468758744-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "df550d234d364e7fb20f9ac88392be8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3fa2f6fa-52", "ovs_interfaceid": "3fa2f6fa-5235-40b1-95d4-a5750a801212", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 22:03:21 compute-0 nova_compute[192716]: 2025-10-07 22:03:21.320 2 DEBUG nova.network.os_vif_util [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7a:7b:28,bridge_name='br-int',has_traffic_filtering=True,id=3fa2f6fa-5235-40b1-95d4-a5750a801212,network=Network(726154fe-bda6-431d-b983-7caa973a9e17),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3fa2f6fa-52') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 22:03:21 compute-0 nova_compute[192716]: 2025-10-07 22:03:21.322 2 DEBUG nova.objects.instance [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Lazy-loading 'pci_devices' on Instance uuid d237b0bb-037e-4864-9f2b-3cd5343c9b1a obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 22:03:21 compute-0 nova_compute[192716]: 2025-10-07 22:03:21.831 2 DEBUG nova.virt.libvirt.driver [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] End _get_guest_xml xml=<domain type="kvm">
Oct 07 22:03:21 compute-0 nova_compute[192716]:   <uuid>d237b0bb-037e-4864-9f2b-3cd5343c9b1a</uuid>
Oct 07 22:03:21 compute-0 nova_compute[192716]:   <name>instance-0000000f</name>
Oct 07 22:03:21 compute-0 nova_compute[192716]:   <memory>131072</memory>
Oct 07 22:03:21 compute-0 nova_compute[192716]:   <vcpu>1</vcpu>
Oct 07 22:03:21 compute-0 nova_compute[192716]:   <metadata>
Oct 07 22:03:21 compute-0 nova_compute[192716]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 07 22:03:21 compute-0 nova_compute[192716]:       <nova:package version="32.1.0-0.20251007122402.7278e66.el10"/>
Oct 07 22:03:21 compute-0 nova_compute[192716]:       <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-156515144</nova:name>
Oct 07 22:03:21 compute-0 nova_compute[192716]:       <nova:creationTime>2025-10-07 22:03:21</nova:creationTime>
Oct 07 22:03:21 compute-0 nova_compute[192716]:       <nova:flavor name="m1.nano" id="e6ccb6f6-2ec4-4305-9fdd-4d931e83ed21">
Oct 07 22:03:21 compute-0 nova_compute[192716]:         <nova:memory>128</nova:memory>
Oct 07 22:03:21 compute-0 nova_compute[192716]:         <nova:disk>1</nova:disk>
Oct 07 22:03:21 compute-0 nova_compute[192716]:         <nova:swap>0</nova:swap>
Oct 07 22:03:21 compute-0 nova_compute[192716]:         <nova:ephemeral>0</nova:ephemeral>
Oct 07 22:03:21 compute-0 nova_compute[192716]:         <nova:vcpus>1</nova:vcpus>
Oct 07 22:03:21 compute-0 nova_compute[192716]:         <nova:extraSpecs>
Oct 07 22:03:21 compute-0 nova_compute[192716]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 07 22:03:21 compute-0 nova_compute[192716]:         </nova:extraSpecs>
Oct 07 22:03:21 compute-0 nova_compute[192716]:       </nova:flavor>
Oct 07 22:03:21 compute-0 nova_compute[192716]:       <nova:image uuid="c40cab67-7e52-4762-b275-de0efa24bdf4">
Oct 07 22:03:21 compute-0 nova_compute[192716]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 07 22:03:21 compute-0 nova_compute[192716]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 07 22:03:21 compute-0 nova_compute[192716]:         <nova:minDisk>1</nova:minDisk>
Oct 07 22:03:21 compute-0 nova_compute[192716]:         <nova:minRam>0</nova:minRam>
Oct 07 22:03:21 compute-0 nova_compute[192716]:         <nova:properties>
Oct 07 22:03:21 compute-0 nova_compute[192716]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 07 22:03:21 compute-0 nova_compute[192716]:         </nova:properties>
Oct 07 22:03:21 compute-0 nova_compute[192716]:       </nova:image>
Oct 07 22:03:21 compute-0 nova_compute[192716]:       <nova:owner>
Oct 07 22:03:21 compute-0 nova_compute[192716]:         <nova:user uuid="db99335261504aa7b84c7d30ec17d679">tempest-TestExecuteHostMaintenanceStrategy-152687663-project-admin</nova:user>
Oct 07 22:03:21 compute-0 nova_compute[192716]:         <nova:project uuid="3ad27d63f39845acba6b21828806b82a">tempest-TestExecuteHostMaintenanceStrategy-152687663</nova:project>
Oct 07 22:03:21 compute-0 nova_compute[192716]:       </nova:owner>
Oct 07 22:03:21 compute-0 nova_compute[192716]:       <nova:root type="image" uuid="c40cab67-7e52-4762-b275-de0efa24bdf4"/>
Oct 07 22:03:21 compute-0 nova_compute[192716]:       <nova:ports>
Oct 07 22:03:21 compute-0 nova_compute[192716]:         <nova:port uuid="3fa2f6fa-5235-40b1-95d4-a5750a801212">
Oct 07 22:03:21 compute-0 nova_compute[192716]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Oct 07 22:03:21 compute-0 nova_compute[192716]:         </nova:port>
Oct 07 22:03:21 compute-0 nova_compute[192716]:       </nova:ports>
Oct 07 22:03:21 compute-0 nova_compute[192716]:     </nova:instance>
Oct 07 22:03:21 compute-0 nova_compute[192716]:   </metadata>
Oct 07 22:03:21 compute-0 nova_compute[192716]:   <sysinfo type="smbios">
Oct 07 22:03:21 compute-0 nova_compute[192716]:     <system>
Oct 07 22:03:21 compute-0 nova_compute[192716]:       <entry name="manufacturer">RDO</entry>
Oct 07 22:03:21 compute-0 nova_compute[192716]:       <entry name="product">OpenStack Compute</entry>
Oct 07 22:03:21 compute-0 nova_compute[192716]:       <entry name="version">32.1.0-0.20251007122402.7278e66.el10</entry>
Oct 07 22:03:21 compute-0 nova_compute[192716]:       <entry name="serial">d237b0bb-037e-4864-9f2b-3cd5343c9b1a</entry>
Oct 07 22:03:21 compute-0 nova_compute[192716]:       <entry name="uuid">d237b0bb-037e-4864-9f2b-3cd5343c9b1a</entry>
Oct 07 22:03:21 compute-0 nova_compute[192716]:       <entry name="family">Virtual Machine</entry>
Oct 07 22:03:21 compute-0 nova_compute[192716]:     </system>
Oct 07 22:03:21 compute-0 nova_compute[192716]:   </sysinfo>
Oct 07 22:03:21 compute-0 nova_compute[192716]:   <os>
Oct 07 22:03:21 compute-0 nova_compute[192716]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 07 22:03:21 compute-0 nova_compute[192716]:     <boot dev="hd"/>
Oct 07 22:03:21 compute-0 nova_compute[192716]:     <smbios mode="sysinfo"/>
Oct 07 22:03:21 compute-0 nova_compute[192716]:   </os>
Oct 07 22:03:21 compute-0 nova_compute[192716]:   <features>
Oct 07 22:03:21 compute-0 nova_compute[192716]:     <acpi/>
Oct 07 22:03:21 compute-0 nova_compute[192716]:     <apic/>
Oct 07 22:03:21 compute-0 nova_compute[192716]:     <vmcoreinfo/>
Oct 07 22:03:21 compute-0 nova_compute[192716]:   </features>
Oct 07 22:03:21 compute-0 nova_compute[192716]:   <clock offset="utc">
Oct 07 22:03:21 compute-0 nova_compute[192716]:     <timer name="pit" tickpolicy="delay"/>
Oct 07 22:03:21 compute-0 nova_compute[192716]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 07 22:03:21 compute-0 nova_compute[192716]:     <timer name="hpet" present="no"/>
Oct 07 22:03:21 compute-0 nova_compute[192716]:   </clock>
Oct 07 22:03:21 compute-0 nova_compute[192716]:   <cpu mode="host-model" match="exact">
Oct 07 22:03:21 compute-0 nova_compute[192716]:     <topology sockets="1" cores="1" threads="1"/>
Oct 07 22:03:21 compute-0 nova_compute[192716]:   </cpu>
Oct 07 22:03:21 compute-0 nova_compute[192716]:   <devices>
Oct 07 22:03:21 compute-0 nova_compute[192716]:     <disk type="file" device="disk">
Oct 07 22:03:21 compute-0 nova_compute[192716]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 07 22:03:21 compute-0 nova_compute[192716]:       <source file="/var/lib/nova/instances/d237b0bb-037e-4864-9f2b-3cd5343c9b1a/disk"/>
Oct 07 22:03:21 compute-0 nova_compute[192716]:       <target dev="vda" bus="virtio"/>
Oct 07 22:03:21 compute-0 nova_compute[192716]:     </disk>
Oct 07 22:03:21 compute-0 nova_compute[192716]:     <disk type="file" device="cdrom">
Oct 07 22:03:21 compute-0 nova_compute[192716]:       <driver name="qemu" type="raw" cache="none"/>
Oct 07 22:03:21 compute-0 nova_compute[192716]:       <source file="/var/lib/nova/instances/d237b0bb-037e-4864-9f2b-3cd5343c9b1a/disk.config"/>
Oct 07 22:03:21 compute-0 nova_compute[192716]:       <target dev="sda" bus="sata"/>
Oct 07 22:03:21 compute-0 nova_compute[192716]:     </disk>
Oct 07 22:03:21 compute-0 nova_compute[192716]:     <interface type="ethernet">
Oct 07 22:03:21 compute-0 nova_compute[192716]:       <mac address="fa:16:3e:7a:7b:28"/>
Oct 07 22:03:21 compute-0 nova_compute[192716]:       <model type="virtio"/>
Oct 07 22:03:21 compute-0 nova_compute[192716]:       <driver name="vhost" rx_queue_size="512"/>
Oct 07 22:03:21 compute-0 nova_compute[192716]:       <mtu size="1442"/>
Oct 07 22:03:21 compute-0 nova_compute[192716]:       <target dev="tap3fa2f6fa-52"/>
Oct 07 22:03:21 compute-0 nova_compute[192716]:     </interface>
Oct 07 22:03:21 compute-0 nova_compute[192716]:     <serial type="pty">
Oct 07 22:03:21 compute-0 nova_compute[192716]:       <log file="/var/lib/nova/instances/d237b0bb-037e-4864-9f2b-3cd5343c9b1a/console.log" append="off"/>
Oct 07 22:03:21 compute-0 nova_compute[192716]:     </serial>
Oct 07 22:03:21 compute-0 nova_compute[192716]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 07 22:03:21 compute-0 nova_compute[192716]:     <video>
Oct 07 22:03:21 compute-0 nova_compute[192716]:       <model type="virtio"/>
Oct 07 22:03:21 compute-0 nova_compute[192716]:     </video>
Oct 07 22:03:21 compute-0 nova_compute[192716]:     <input type="tablet" bus="usb"/>
Oct 07 22:03:21 compute-0 nova_compute[192716]:     <rng model="virtio">
Oct 07 22:03:21 compute-0 nova_compute[192716]:       <backend model="random">/dev/urandom</backend>
Oct 07 22:03:21 compute-0 nova_compute[192716]:     </rng>
Oct 07 22:03:21 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root"/>
Oct 07 22:03:21 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:03:21 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:03:21 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:03:21 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:03:21 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:03:21 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:03:21 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:03:21 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:03:21 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:03:21 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:03:21 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:03:21 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:03:21 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:03:21 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:03:21 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:03:21 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:03:21 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:03:21 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:03:21 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:03:21 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:03:21 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:03:21 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:03:21 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:03:21 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:03:21 compute-0 nova_compute[192716]:     <controller type="usb" index="0"/>
Oct 07 22:03:21 compute-0 nova_compute[192716]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 07 22:03:21 compute-0 nova_compute[192716]:       <stats period="10"/>
Oct 07 22:03:21 compute-0 nova_compute[192716]:     </memballoon>
Oct 07 22:03:21 compute-0 nova_compute[192716]:   </devices>
Oct 07 22:03:21 compute-0 nova_compute[192716]: </domain>
Oct 07 22:03:21 compute-0 nova_compute[192716]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Oct 07 22:03:21 compute-0 nova_compute[192716]: 2025-10-07 22:03:21.833 2 DEBUG nova.compute.manager [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Preparing to wait for external event network-vif-plugged-3fa2f6fa-5235-40b1-95d4-a5750a801212 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Oct 07 22:03:21 compute-0 nova_compute[192716]: 2025-10-07 22:03:21.834 2 DEBUG oslo_concurrency.lockutils [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Acquiring lock "d237b0bb-037e-4864-9f2b-3cd5343c9b1a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:03:21 compute-0 nova_compute[192716]: 2025-10-07 22:03:21.834 2 DEBUG oslo_concurrency.lockutils [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Lock "d237b0bb-037e-4864-9f2b-3cd5343c9b1a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:03:21 compute-0 nova_compute[192716]: 2025-10-07 22:03:21.835 2 DEBUG oslo_concurrency.lockutils [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Lock "d237b0bb-037e-4864-9f2b-3cd5343c9b1a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:03:21 compute-0 nova_compute[192716]: 2025-10-07 22:03:21.836 2 DEBUG nova.virt.libvirt.vif [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-07T22:03:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-156515144',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-156515144',id=15,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3ad27d63f39845acba6b21828806b82a',ramdisk_id='',reservation_id='r-vaeio20y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-152687663',owner_user_name='tempest
-TestExecuteHostMaintenanceStrategy-152687663-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T22:03:15Z,user_data=None,user_id='db99335261504aa7b84c7d30ec17d679',uuid=d237b0bb-037e-4864-9f2b-3cd5343c9b1a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3fa2f6fa-5235-40b1-95d4-a5750a801212", "address": "fa:16:3e:7a:7b:28", "network": {"id": "726154fe-bda6-431d-b983-7caa973a9e17", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-468758744-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "df550d234d364e7fb20f9ac88392be8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3fa2f6fa-52", "ovs_interfaceid": "3fa2f6fa-5235-40b1-95d4-a5750a801212", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 07 22:03:21 compute-0 nova_compute[192716]: 2025-10-07 22:03:21.837 2 DEBUG nova.network.os_vif_util [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Converting VIF {"id": "3fa2f6fa-5235-40b1-95d4-a5750a801212", "address": "fa:16:3e:7a:7b:28", "network": {"id": "726154fe-bda6-431d-b983-7caa973a9e17", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-468758744-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "df550d234d364e7fb20f9ac88392be8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3fa2f6fa-52", "ovs_interfaceid": "3fa2f6fa-5235-40b1-95d4-a5750a801212", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 22:03:21 compute-0 nova_compute[192716]: 2025-10-07 22:03:21.838 2 DEBUG nova.network.os_vif_util [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7a:7b:28,bridge_name='br-int',has_traffic_filtering=True,id=3fa2f6fa-5235-40b1-95d4-a5750a801212,network=Network(726154fe-bda6-431d-b983-7caa973a9e17),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3fa2f6fa-52') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 22:03:21 compute-0 nova_compute[192716]: 2025-10-07 22:03:21.839 2 DEBUG os_vif [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7a:7b:28,bridge_name='br-int',has_traffic_filtering=True,id=3fa2f6fa-5235-40b1-95d4-a5750a801212,network=Network(726154fe-bda6-431d-b983-7caa973a9e17),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3fa2f6fa-52') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 07 22:03:21 compute-0 nova_compute[192716]: 2025-10-07 22:03:21.840 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:03:21 compute-0 nova_compute[192716]: 2025-10-07 22:03:21.840 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:03:21 compute-0 nova_compute[192716]: 2025-10-07 22:03:21.841 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:03:21 compute-0 nova_compute[192716]: 2025-10-07 22:03:21.842 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:03:21 compute-0 nova_compute[192716]: 2025-10-07 22:03:21.842 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'ce77825c-450d-50e1-9ab2-6733264638e4', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:03:21 compute-0 nova_compute[192716]: 2025-10-07 22:03:21.844 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:03:21 compute-0 nova_compute[192716]: 2025-10-07 22:03:21.846 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 07 22:03:21 compute-0 nova_compute[192716]: 2025-10-07 22:03:21.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:03:21 compute-0 nova_compute[192716]: 2025-10-07 22:03:21.850 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3fa2f6fa-52, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:03:21 compute-0 nova_compute[192716]: 2025-10-07 22:03:21.851 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap3fa2f6fa-52, col_values=(('qos', UUID('c859c2c9-5762-4262-b94a-b95045b21884')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:03:21 compute-0 nova_compute[192716]: 2025-10-07 22:03:21.851 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap3fa2f6fa-52, col_values=(('external_ids', {'iface-id': '3fa2f6fa-5235-40b1-95d4-a5750a801212', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7a:7b:28', 'vm-uuid': 'd237b0bb-037e-4864-9f2b-3cd5343c9b1a'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:03:21 compute-0 nova_compute[192716]: 2025-10-07 22:03:21.853 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:03:21 compute-0 NetworkManager[51722]: <info>  [1759874601.8543] manager: (tap3fa2f6fa-52): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/50)
Oct 07 22:03:21 compute-0 nova_compute[192716]: 2025-10-07 22:03:21.856 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 07 22:03:21 compute-0 nova_compute[192716]: 2025-10-07 22:03:21.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:03:21 compute-0 nova_compute[192716]: 2025-10-07 22:03:21.865 2 INFO os_vif [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7a:7b:28,bridge_name='br-int',has_traffic_filtering=True,id=3fa2f6fa-5235-40b1-95d4-a5750a801212,network=Network(726154fe-bda6-431d-b983-7caa973a9e17),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3fa2f6fa-52')
Oct 07 22:03:23 compute-0 nova_compute[192716]: 2025-10-07 22:03:23.417 2 DEBUG nova.virt.libvirt.driver [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 07 22:03:23 compute-0 nova_compute[192716]: 2025-10-07 22:03:23.418 2 DEBUG nova.virt.libvirt.driver [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 07 22:03:23 compute-0 nova_compute[192716]: 2025-10-07 22:03:23.418 2 DEBUG nova.virt.libvirt.driver [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] No VIF found with MAC fa:16:3e:7a:7b:28, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Oct 07 22:03:23 compute-0 nova_compute[192716]: 2025-10-07 22:03:23.418 2 INFO nova.virt.libvirt.driver [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Using config drive
Oct 07 22:03:23 compute-0 nova_compute[192716]: 2025-10-07 22:03:23.929 2 WARNING neutronclient.v2_0.client [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:03:24 compute-0 nova_compute[192716]: 2025-10-07 22:03:24.100 2 INFO nova.virt.libvirt.driver [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Creating config drive at /var/lib/nova/instances/d237b0bb-037e-4864-9f2b-3cd5343c9b1a/disk.config
Oct 07 22:03:24 compute-0 nova_compute[192716]: 2025-10-07 22:03:24.109 2 DEBUG oslo_concurrency.processutils [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d237b0bb-037e-4864-9f2b-3cd5343c9b1a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251007122402.7278e66.el10 -quiet -J -r -V config-2 /tmp/tmpmxurnzft execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:03:24 compute-0 nova_compute[192716]: 2025-10-07 22:03:24.240 2 DEBUG oslo_concurrency.processutils [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d237b0bb-037e-4864-9f2b-3cd5343c9b1a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251007122402.7278e66.el10 -quiet -J -r -V config-2 /tmp/tmpmxurnzft" returned: 0 in 0.131s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:03:24 compute-0 kernel: tap3fa2f6fa-52: entered promiscuous mode
Oct 07 22:03:24 compute-0 NetworkManager[51722]: <info>  [1759874604.3313] manager: (tap3fa2f6fa-52): new Tun device (/org/freedesktop/NetworkManager/Devices/51)
Oct 07 22:03:24 compute-0 ovn_controller[94904]: 2025-10-07T22:03:24Z|00135|binding|INFO|Claiming lport 3fa2f6fa-5235-40b1-95d4-a5750a801212 for this chassis.
Oct 07 22:03:24 compute-0 ovn_controller[94904]: 2025-10-07T22:03:24Z|00136|binding|INFO|3fa2f6fa-5235-40b1-95d4-a5750a801212: Claiming fa:16:3e:7a:7b:28 10.100.0.10
Oct 07 22:03:24 compute-0 nova_compute[192716]: 2025-10-07 22:03:24.335 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:03:24 compute-0 ovn_controller[94904]: 2025-10-07T22:03:24Z|00137|binding|INFO|Setting lport 3fa2f6fa-5235-40b1-95d4-a5750a801212 ovn-installed in OVS
Oct 07 22:03:24 compute-0 ovn_controller[94904]: 2025-10-07T22:03:24Z|00138|binding|INFO|Setting lport 3fa2f6fa-5235-40b1-95d4-a5750a801212 up in Southbound
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:03:24.347 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7a:7b:28 10.100.0.10'], port_security=['fa:16:3e:7a:7b:28 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'd237b0bb-037e-4864-9f2b-3cd5343c9b1a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-726154fe-bda6-431d-b983-7caa973a9e17', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ad27d63f39845acba6b21828806b82a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7085a98e-3cea-46c4-a04e-730e3c566bcd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4057f821-2b26-4a21-8644-5757b0f352fc, chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], logical_port=3fa2f6fa-5235-40b1-95d4-a5750a801212) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:03:24.348 103791 INFO neutron.agent.ovn.metadata.agent [-] Port 3fa2f6fa-5235-40b1-95d4-a5750a801212 in datapath 726154fe-bda6-431d-b983-7caa973a9e17 bound to our chassis
Oct 07 22:03:24 compute-0 nova_compute[192716]: 2025-10-07 22:03:24.349 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:03:24.350 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 726154fe-bda6-431d-b983-7caa973a9e17
Oct 07 22:03:24 compute-0 nova_compute[192716]: 2025-10-07 22:03:24.353 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:03:24.369 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[fcf61d95-e123-4b99-a86f-fe02b944b442]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:03:24.370 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap726154fe-b1 in ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:03:24.372 214116 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap726154fe-b0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:03:24.372 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[e18640c5-ea87-4d3a-a5b1-2d1cc3ca736d]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:03:24.373 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[6ed2af24-c848-4979-82d3-cdc0c71a73dd]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:03:24 compute-0 systemd-machined[152719]: New machine qemu-10-instance-0000000f.
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:03:24.391 103905 DEBUG oslo.privsep.daemon [-] privsep: reply[96d20875-b8a6-4fdd-a264-3fad28651f73]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:03:24 compute-0 systemd[1]: Started Virtual Machine qemu-10-instance-0000000f.
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:03:24.411 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[ef8fda03-5a3e-42ed-be7c-a71731e976b0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:03:24 compute-0 systemd-udevd[220649]: Network interface NamePolicy= disabled on kernel command line.
Oct 07 22:03:24 compute-0 NetworkManager[51722]: <info>  [1759874604.4433] device (tap3fa2f6fa-52): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 07 22:03:24 compute-0 NetworkManager[51722]: <info>  [1759874604.4452] device (tap3fa2f6fa-52): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:03:24.452 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[95dba9e9-13cd-49e8-adef-132e0a1c3750]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:03:24.456 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[22ccebdd-7c49-49fb-8fd0-574dde851925]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:03:24 compute-0 NetworkManager[51722]: <info>  [1759874604.4578] manager: (tap726154fe-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/52)
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:03:24.494 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[a6dab933-30bc-4227-8eb8-509b3f998986]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:03:24.498 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[2daa7fe7-f520-432f-b68a-a4cf25a05b9a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:03:24 compute-0 NetworkManager[51722]: <info>  [1759874604.5219] device (tap726154fe-b0): carrier: link connected
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:03:24.529 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[d8b5a1b4-075d-41d2-a623-c962f297632e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:03:24 compute-0 nova_compute[192716]: 2025-10-07 22:03:24.531 2 DEBUG nova.compute.manager [req-a48afe92-23a7-46c1-8406-49df2eb47c6d req-def7ca0e-7a4c-4a75-a199-8777589cffd6 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Received event network-vif-plugged-3fa2f6fa-5235-40b1-95d4-a5750a801212 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:03:24 compute-0 nova_compute[192716]: 2025-10-07 22:03:24.532 2 DEBUG oslo_concurrency.lockutils [req-a48afe92-23a7-46c1-8406-49df2eb47c6d req-def7ca0e-7a4c-4a75-a199-8777589cffd6 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "d237b0bb-037e-4864-9f2b-3cd5343c9b1a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:03:24 compute-0 nova_compute[192716]: 2025-10-07 22:03:24.532 2 DEBUG oslo_concurrency.lockutils [req-a48afe92-23a7-46c1-8406-49df2eb47c6d req-def7ca0e-7a4c-4a75-a199-8777589cffd6 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "d237b0bb-037e-4864-9f2b-3cd5343c9b1a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:03:24 compute-0 nova_compute[192716]: 2025-10-07 22:03:24.532 2 DEBUG oslo_concurrency.lockutils [req-a48afe92-23a7-46c1-8406-49df2eb47c6d req-def7ca0e-7a4c-4a75-a199-8777589cffd6 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "d237b0bb-037e-4864-9f2b-3cd5343c9b1a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:03:24 compute-0 nova_compute[192716]: 2025-10-07 22:03:24.533 2 DEBUG nova.compute.manager [req-a48afe92-23a7-46c1-8406-49df2eb47c6d req-def7ca0e-7a4c-4a75-a199-8777589cffd6 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Processing event network-vif-plugged-3fa2f6fa-5235-40b1-95d4-a5750a801212 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:03:24.546 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[fcc40f1c-a467-414e-b736-f0e9c0e67a8f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap726154fe-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:af:b2:2e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 433810, 'reachable_time': 38236, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220679, 'error': None, 'target': 'ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:03:24.563 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[f42fd42e-69a1-40be-b5ab-f67bacf240fa]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feaf:b22e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 433810, 'tstamp': 433810}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220680, 'error': None, 'target': 'ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:03:24.584 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[425c45e8-239a-4d97-a60a-67e33b97519e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap726154fe-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:af:b2:2e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 433810, 'reachable_time': 38236, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220681, 'error': None, 'target': 'ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:03:24.633 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[92e29647-6efe-4e88-ba9a-130eab2dbb07]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:03:24.717 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[cf36eac0-56cc-4724-a260-745346c90ae4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:03:24.718 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap726154fe-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:03:24.718 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:03:24.718 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap726154fe-b0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:03:24 compute-0 nova_compute[192716]: 2025-10-07 22:03:24.721 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:03:24 compute-0 NetworkManager[51722]: <info>  [1759874604.7216] manager: (tap726154fe-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/53)
Oct 07 22:03:24 compute-0 kernel: tap726154fe-b0: entered promiscuous mode
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:03:24.725 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap726154fe-b0, col_values=(('external_ids', {'iface-id': 'b6dfddd4-019f-4508-ab9b-37759605366f'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:03:24 compute-0 nova_compute[192716]: 2025-10-07 22:03:24.727 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:03:24 compute-0 ovn_controller[94904]: 2025-10-07T22:03:24Z|00139|binding|INFO|Releasing lport b6dfddd4-019f-4508-ab9b-37759605366f from this chassis (sb_readonly=0)
Oct 07 22:03:24 compute-0 nova_compute[192716]: 2025-10-07 22:03:24.751 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:03:24 compute-0 nova_compute[192716]: 2025-10-07 22:03:24.753 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:03:24.754 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[63522fbb-2c8e-471a-b163-76e84bb38a92]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:03:24.755 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/726154fe-bda6-431d-b983-7caa973a9e17.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/726154fe-bda6-431d-b983-7caa973a9e17.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:03:24.755 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/726154fe-bda6-431d-b983-7caa973a9e17.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/726154fe-bda6-431d-b983-7caa973a9e17.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:03:24.755 103791 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 726154fe-bda6-431d-b983-7caa973a9e17 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:03:24.756 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/726154fe-bda6-431d-b983-7caa973a9e17.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/726154fe-bda6-431d-b983-7caa973a9e17.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:03:24.759 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[56eca90d-fdad-4fc2-a70b-7909bdc7b561]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:03:24.760 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/726154fe-bda6-431d-b983-7caa973a9e17.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/726154fe-bda6-431d-b983-7caa973a9e17.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:03:24.760 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[5752df36-705c-47d1-b99a-9289b0d16d65]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:03:24.761 103791 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]: global
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]:     log         /dev/log local0 debug
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]:     log-tag     haproxy-metadata-proxy-726154fe-bda6-431d-b983-7caa973a9e17
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]:     user        root
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]:     group       root
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]:     maxconn     1024
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]:     pidfile     /var/lib/neutron/external/pids/726154fe-bda6-431d-b983-7caa973a9e17.pid.haproxy
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]:     daemon
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]: 
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]: defaults
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]:     log global
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]:     mode http
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]:     option httplog
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]:     option dontlognull
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]:     option http-server-close
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]:     option forwardfor
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]:     retries                 3
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]:     timeout http-request    30s
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]:     timeout connect         30s
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]:     timeout client          32s
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]:     timeout server          32s
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]:     timeout http-keep-alive 30s
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]: 
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]: listen listener
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]:     bind 169.254.169.254:80
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]:     
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]:     server metadata /var/lib/neutron/metadata_proxy
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]: 
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]:     http-request add-header X-OVN-Network-ID 726154fe-bda6-431d-b983-7caa973a9e17
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Oct 07 22:03:24 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:03:24.761 103791 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17', 'env', 'PROCESS_TAG=haproxy-726154fe-bda6-431d-b983-7caa973a9e17', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/726154fe-bda6-431d-b983-7caa973a9e17.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Oct 07 22:03:25 compute-0 podman[220720]: 2025-10-07 22:03:25.170039217 +0000 UTC m=+0.050562494 container create d449cfdac63c80e0349038a77d1427e061819b163df7a20d97ebee0f9a3933ba (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251007, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4)
Oct 07 22:03:25 compute-0 systemd[1]: Started libpod-conmon-d449cfdac63c80e0349038a77d1427e061819b163df7a20d97ebee0f9a3933ba.scope.
Oct 07 22:03:25 compute-0 podman[220720]: 2025-10-07 22:03:25.142110789 +0000 UTC m=+0.022634076 image pull 24d4277b41bbd1d97b6f360ea068040fe96182680512bacad34d1f578f4798a9 38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Oct 07 22:03:25 compute-0 systemd[1]: Started libcrun container.
Oct 07 22:03:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/425525147de1bd309e2ffcbf4b6e919280d243c81ab8da590014e3a4513f993f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 07 22:03:25 compute-0 podman[220720]: 2025-10-07 22:03:25.266054165 +0000 UTC m=+0.146577452 container init d449cfdac63c80e0349038a77d1427e061819b163df7a20d97ebee0f9a3933ba (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, tcib_managed=true)
Oct 07 22:03:25 compute-0 podman[220720]: 2025-10-07 22:03:25.276084205 +0000 UTC m=+0.156607462 container start d449cfdac63c80e0349038a77d1427e061819b163df7a20d97ebee0f9a3933ba (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17, tcib_managed=true, org.label-schema.build-date=20251007, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2)
Oct 07 22:03:25 compute-0 neutron-haproxy-ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17[220735]: [NOTICE]   (220739) : New worker (220741) forked
Oct 07 22:03:25 compute-0 neutron-haproxy-ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17[220735]: [NOTICE]   (220739) : Loading success.
Oct 07 22:03:25 compute-0 nova_compute[192716]: 2025-10-07 22:03:25.399 2 DEBUG nova.compute.manager [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Oct 07 22:03:25 compute-0 nova_compute[192716]: 2025-10-07 22:03:25.404 2 DEBUG nova.virt.libvirt.driver [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Oct 07 22:03:25 compute-0 nova_compute[192716]: 2025-10-07 22:03:25.409 2 INFO nova.virt.libvirt.driver [-] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Instance spawned successfully.
Oct 07 22:03:25 compute-0 nova_compute[192716]: 2025-10-07 22:03:25.409 2 DEBUG nova.virt.libvirt.driver [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Oct 07 22:03:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:03:25.627 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:03:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:03:25.628 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:03:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:03:25.628 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:03:25 compute-0 nova_compute[192716]: 2025-10-07 22:03:25.928 2 DEBUG nova.virt.libvirt.driver [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 22:03:25 compute-0 nova_compute[192716]: 2025-10-07 22:03:25.929 2 DEBUG nova.virt.libvirt.driver [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 22:03:25 compute-0 nova_compute[192716]: 2025-10-07 22:03:25.930 2 DEBUG nova.virt.libvirt.driver [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 22:03:25 compute-0 nova_compute[192716]: 2025-10-07 22:03:25.930 2 DEBUG nova.virt.libvirt.driver [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 22:03:25 compute-0 nova_compute[192716]: 2025-10-07 22:03:25.931 2 DEBUG nova.virt.libvirt.driver [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 22:03:25 compute-0 nova_compute[192716]: 2025-10-07 22:03:25.932 2 DEBUG nova.virt.libvirt.driver [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 22:03:26 compute-0 nova_compute[192716]: 2025-10-07 22:03:26.443 2 INFO nova.compute.manager [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Took 10.39 seconds to spawn the instance on the hypervisor.
Oct 07 22:03:26 compute-0 nova_compute[192716]: 2025-10-07 22:03:26.443 2 DEBUG nova.compute.manager [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 07 22:03:26 compute-0 nova_compute[192716]: 2025-10-07 22:03:26.614 2 DEBUG nova.compute.manager [req-51d9d261-9638-4a38-8414-d01ee87b1a7f req-817e2b34-db41-4e4b-ac7d-106e38356130 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Received event network-vif-plugged-3fa2f6fa-5235-40b1-95d4-a5750a801212 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:03:26 compute-0 nova_compute[192716]: 2025-10-07 22:03:26.615 2 DEBUG oslo_concurrency.lockutils [req-51d9d261-9638-4a38-8414-d01ee87b1a7f req-817e2b34-db41-4e4b-ac7d-106e38356130 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "d237b0bb-037e-4864-9f2b-3cd5343c9b1a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:03:26 compute-0 nova_compute[192716]: 2025-10-07 22:03:26.615 2 DEBUG oslo_concurrency.lockutils [req-51d9d261-9638-4a38-8414-d01ee87b1a7f req-817e2b34-db41-4e4b-ac7d-106e38356130 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "d237b0bb-037e-4864-9f2b-3cd5343c9b1a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:03:26 compute-0 nova_compute[192716]: 2025-10-07 22:03:26.615 2 DEBUG oslo_concurrency.lockutils [req-51d9d261-9638-4a38-8414-d01ee87b1a7f req-817e2b34-db41-4e4b-ac7d-106e38356130 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "d237b0bb-037e-4864-9f2b-3cd5343c9b1a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:03:26 compute-0 nova_compute[192716]: 2025-10-07 22:03:26.616 2 DEBUG nova.compute.manager [req-51d9d261-9638-4a38-8414-d01ee87b1a7f req-817e2b34-db41-4e4b-ac7d-106e38356130 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] No waiting events found dispatching network-vif-plugged-3fa2f6fa-5235-40b1-95d4-a5750a801212 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 22:03:26 compute-0 nova_compute[192716]: 2025-10-07 22:03:26.616 2 WARNING nova.compute.manager [req-51d9d261-9638-4a38-8414-d01ee87b1a7f req-817e2b34-db41-4e4b-ac7d-106e38356130 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Received unexpected event network-vif-plugged-3fa2f6fa-5235-40b1-95d4-a5750a801212 for instance with vm_state active and task_state None.
Oct 07 22:03:26 compute-0 nova_compute[192716]: 2025-10-07 22:03:26.853 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:03:26 compute-0 nova_compute[192716]: 2025-10-07 22:03:26.971 2 INFO nova.compute.manager [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Took 15.69 seconds to build instance.
Oct 07 22:03:26 compute-0 nova_compute[192716]: 2025-10-07 22:03:26.989 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:03:27 compute-0 nova_compute[192716]: 2025-10-07 22:03:27.475 2 DEBUG oslo_concurrency.lockutils [None req-921d2d5a-794d-45b0-baea-a3976c534fb5 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Lock "d237b0bb-037e-4864-9f2b-3cd5343c9b1a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.221s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:03:27 compute-0 podman[220751]: 2025-10-07 22:03:27.816999932 +0000 UTC m=+0.056948338 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251007, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 07 22:03:27 compute-0 podman[220752]: 2025-10-07 22:03:27.846018182 +0000 UTC m=+0.074711873 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct 07 22:03:28 compute-0 nova_compute[192716]: 2025-10-07 22:03:28.989 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:03:28 compute-0 nova_compute[192716]: 2025-10-07 22:03:28.990 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 07 22:03:29 compute-0 podman[203153]: time="2025-10-07T22:03:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:03:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:03:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20751 "" "Go-http-client/1.1"
Oct 07 22:03:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:03:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3483 "" "Go-http-client/1.1"
Oct 07 22:03:29 compute-0 nova_compute[192716]: 2025-10-07 22:03:29.773 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:03:30 compute-0 podman[220792]: 2025-10-07 22:03:30.85292265 +0000 UTC m=+0.078075709 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 07 22:03:31 compute-0 openstack_network_exporter[205305]: ERROR   22:03:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:03:31 compute-0 openstack_network_exporter[205305]: ERROR   22:03:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:03:31 compute-0 openstack_network_exporter[205305]: ERROR   22:03:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:03:31 compute-0 openstack_network_exporter[205305]: ERROR   22:03:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:03:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:03:31 compute-0 openstack_network_exporter[205305]: ERROR   22:03:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:03:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:03:31 compute-0 nova_compute[192716]: 2025-10-07 22:03:31.856 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:03:31 compute-0 nova_compute[192716]: 2025-10-07 22:03:31.986 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:03:31 compute-0 nova_compute[192716]: 2025-10-07 22:03:31.989 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:03:34 compute-0 nova_compute[192716]: 2025-10-07 22:03:34.764 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:03:34 compute-0 nova_compute[192716]: 2025-10-07 22:03:34.985 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:03:35 compute-0 nova_compute[192716]: 2025-10-07 22:03:35.989 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:03:36 compute-0 nova_compute[192716]: 2025-10-07 22:03:36.511 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:03:36 compute-0 nova_compute[192716]: 2025-10-07 22:03:36.513 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:03:36 compute-0 nova_compute[192716]: 2025-10-07 22:03:36.513 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:03:36 compute-0 nova_compute[192716]: 2025-10-07 22:03:36.513 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 07 22:03:36 compute-0 nova_compute[192716]: 2025-10-07 22:03:36.858 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:03:37 compute-0 ovn_controller[94904]: 2025-10-07T22:03:37Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7a:7b:28 10.100.0.10
Oct 07 22:03:37 compute-0 ovn_controller[94904]: 2025-10-07T22:03:37Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7a:7b:28 10.100.0.10
Oct 07 22:03:37 compute-0 nova_compute[192716]: 2025-10-07 22:03:37.561 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d237b0bb-037e-4864-9f2b-3cd5343c9b1a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:03:37 compute-0 nova_compute[192716]: 2025-10-07 22:03:37.647 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d237b0bb-037e-4864-9f2b-3cd5343c9b1a/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:03:37 compute-0 nova_compute[192716]: 2025-10-07 22:03:37.649 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d237b0bb-037e-4864-9f2b-3cd5343c9b1a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:03:37 compute-0 nova_compute[192716]: 2025-10-07 22:03:37.743 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d237b0bb-037e-4864-9f2b-3cd5343c9b1a/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:03:37 compute-0 podman[220833]: 2025-10-07 22:03:37.965398872 +0000 UTC m=+0.194791596 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251007, tcib_build_tag=watcher_latest, container_name=ovn_controller, io.buildah.version=1.41.4)
Oct 07 22:03:37 compute-0 nova_compute[192716]: 2025-10-07 22:03:37.997 2 WARNING nova.virt.libvirt.driver [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 22:03:37 compute-0 nova_compute[192716]: 2025-10-07 22:03:37.999 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:03:38 compute-0 nova_compute[192716]: 2025-10-07 22:03:38.026 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.027s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:03:38 compute-0 nova_compute[192716]: 2025-10-07 22:03:38.027 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5652MB free_disk=73.27552032470703GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 07 22:03:38 compute-0 nova_compute[192716]: 2025-10-07 22:03:38.028 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:03:38 compute-0 nova_compute[192716]: 2025-10-07 22:03:38.028 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:03:39 compute-0 nova_compute[192716]: 2025-10-07 22:03:39.084 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Instance d237b0bb-037e-4864-9f2b-3cd5343c9b1a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 07 22:03:39 compute-0 nova_compute[192716]: 2025-10-07 22:03:39.085 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 07 22:03:39 compute-0 nova_compute[192716]: 2025-10-07 22:03:39.085 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:03:38 up  1:12,  0 user,  load average: 0.32, 0.22, 0.30\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_3ad27d63f39845acba6b21828806b82a': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 07 22:03:39 compute-0 nova_compute[192716]: 2025-10-07 22:03:39.129 2 DEBUG nova.compute.provider_tree [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 22:03:39 compute-0 nova_compute[192716]: 2025-10-07 22:03:39.641 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 22:03:39 compute-0 nova_compute[192716]: 2025-10-07 22:03:39.766 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:03:40 compute-0 nova_compute[192716]: 2025-10-07 22:03:40.151 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 07 22:03:40 compute-0 nova_compute[192716]: 2025-10-07 22:03:40.151 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.123s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:03:40 compute-0 podman[220862]: 2025-10-07 22:03:40.851563888 +0000 UTC m=+0.088701307 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251007)
Oct 07 22:03:41 compute-0 nova_compute[192716]: 2025-10-07 22:03:41.153 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:03:41 compute-0 nova_compute[192716]: 2025-10-07 22:03:41.153 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:03:41 compute-0 nova_compute[192716]: 2025-10-07 22:03:41.153 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:03:41 compute-0 nova_compute[192716]: 2025-10-07 22:03:41.860 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:03:43 compute-0 podman[220882]: 2025-10-07 22:03:43.834966336 +0000 UTC m=+0.076301538 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.openshift.expose-services=, name=ubi9-minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, config_id=edpm, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, release=1755695350, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct 07 22:03:44 compute-0 nova_compute[192716]: 2025-10-07 22:03:44.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:03:45 compute-0 nova_compute[192716]: 2025-10-07 22:03:45.716 2 DEBUG nova.virt.libvirt.driver [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 4901d3e6-68ac-4c50-9462-ba7192c80bf4] Creating tmpfile /var/lib/nova/instances/tmpzkfxs280 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Oct 07 22:03:45 compute-0 nova_compute[192716]: 2025-10-07 22:03:45.718 2 WARNING neutronclient.v2_0.client [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:03:45 compute-0 nova_compute[192716]: 2025-10-07 22:03:45.721 2 DEBUG nova.compute.manager [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpzkfxs280',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Oct 07 22:03:46 compute-0 nova_compute[192716]: 2025-10-07 22:03:46.864 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:03:47 compute-0 nova_compute[192716]: 2025-10-07 22:03:47.764 2 WARNING neutronclient.v2_0.client [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:03:49 compute-0 nova_compute[192716]: 2025-10-07 22:03:49.772 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:03:51 compute-0 nova_compute[192716]: 2025-10-07 22:03:51.866 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:03:52 compute-0 nova_compute[192716]: 2025-10-07 22:03:52.019 2 DEBUG nova.compute.manager [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpzkfxs280',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='4901d3e6-68ac-4c50-9462-ba7192c80bf4',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Oct 07 22:03:53 compute-0 nova_compute[192716]: 2025-10-07 22:03:53.030 2 DEBUG oslo_concurrency.lockutils [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "refresh_cache-4901d3e6-68ac-4c50-9462-ba7192c80bf4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 07 22:03:53 compute-0 nova_compute[192716]: 2025-10-07 22:03:53.031 2 DEBUG oslo_concurrency.lockutils [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquired lock "refresh_cache-4901d3e6-68ac-4c50-9462-ba7192c80bf4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 07 22:03:53 compute-0 nova_compute[192716]: 2025-10-07 22:03:53.031 2 DEBUG nova.network.neutron [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 4901d3e6-68ac-4c50-9462-ba7192c80bf4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 07 22:03:53 compute-0 nova_compute[192716]: 2025-10-07 22:03:53.539 2 WARNING neutronclient.v2_0.client [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:03:54 compute-0 ovn_controller[94904]: 2025-10-07T22:03:54Z|00140|memory_trim|INFO|Detected inactivity (last active 30017 ms ago): trimming memory
Oct 07 22:03:54 compute-0 nova_compute[192716]: 2025-10-07 22:03:54.775 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:03:54 compute-0 nova_compute[192716]: 2025-10-07 22:03:54.983 2 WARNING neutronclient.v2_0.client [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:03:56 compute-0 nova_compute[192716]: 2025-10-07 22:03:56.045 2 DEBUG nova.network.neutron [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 4901d3e6-68ac-4c50-9462-ba7192c80bf4] Updating instance_info_cache with network_info: [{"id": "c2c5a73f-18c6-4300-86ed-9f2441645bb3", "address": "fa:16:3e:bb:8f:4b", "network": {"id": "726154fe-bda6-431d-b983-7caa973a9e17", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-468758744-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "df550d234d364e7fb20f9ac88392be8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2c5a73f-18", "ovs_interfaceid": "c2c5a73f-18c6-4300-86ed-9f2441645bb3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:03:56 compute-0 nova_compute[192716]: 2025-10-07 22:03:56.551 2 DEBUG oslo_concurrency.lockutils [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Releasing lock "refresh_cache-4901d3e6-68ac-4c50-9462-ba7192c80bf4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 07 22:03:56 compute-0 nova_compute[192716]: 2025-10-07 22:03:56.565 2 DEBUG nova.virt.libvirt.driver [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 4901d3e6-68ac-4c50-9462-ba7192c80bf4] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpzkfxs280',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='4901d3e6-68ac-4c50-9462-ba7192c80bf4',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Oct 07 22:03:56 compute-0 nova_compute[192716]: 2025-10-07 22:03:56.566 2 DEBUG nova.virt.libvirt.driver [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 4901d3e6-68ac-4c50-9462-ba7192c80bf4] Creating instance directory: /var/lib/nova/instances/4901d3e6-68ac-4c50-9462-ba7192c80bf4 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Oct 07 22:03:56 compute-0 nova_compute[192716]: 2025-10-07 22:03:56.567 2 DEBUG nova.virt.libvirt.driver [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 4901d3e6-68ac-4c50-9462-ba7192c80bf4] Creating disk.info with the contents: {'/var/lib/nova/instances/4901d3e6-68ac-4c50-9462-ba7192c80bf4/disk': 'qcow2', '/var/lib/nova/instances/4901d3e6-68ac-4c50-9462-ba7192c80bf4/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Oct 07 22:03:56 compute-0 nova_compute[192716]: 2025-10-07 22:03:56.567 2 DEBUG nova.virt.libvirt.driver [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 4901d3e6-68ac-4c50-9462-ba7192c80bf4] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Oct 07 22:03:56 compute-0 nova_compute[192716]: 2025-10-07 22:03:56.568 2 DEBUG nova.objects.instance [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 4901d3e6-68ac-4c50-9462-ba7192c80bf4 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 22:03:56 compute-0 nova_compute[192716]: 2025-10-07 22:03:56.869 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:03:57 compute-0 nova_compute[192716]: 2025-10-07 22:03:57.074 2 DEBUG oslo_utils.imageutils.format_inspector [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:03:57 compute-0 nova_compute[192716]: 2025-10-07 22:03:57.083 2 DEBUG oslo_utils.imageutils.format_inspector [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:03:57 compute-0 nova_compute[192716]: 2025-10-07 22:03:57.086 2 DEBUG oslo_concurrency.processutils [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:03:57 compute-0 nova_compute[192716]: 2025-10-07 22:03:57.176 2 DEBUG oslo_concurrency.processutils [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:03:57 compute-0 nova_compute[192716]: 2025-10-07 22:03:57.177 2 DEBUG oslo_concurrency.lockutils [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:03:57 compute-0 nova_compute[192716]: 2025-10-07 22:03:57.178 2 DEBUG oslo_concurrency.lockutils [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:03:57 compute-0 nova_compute[192716]: 2025-10-07 22:03:57.179 2 DEBUG oslo_utils.imageutils.format_inspector [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:03:57 compute-0 nova_compute[192716]: 2025-10-07 22:03:57.182 2 DEBUG oslo_utils.imageutils.format_inspector [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:03:57 compute-0 nova_compute[192716]: 2025-10-07 22:03:57.183 2 DEBUG oslo_concurrency.processutils [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:03:57 compute-0 nova_compute[192716]: 2025-10-07 22:03:57.249 2 DEBUG oslo_concurrency.processutils [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:03:57 compute-0 nova_compute[192716]: 2025-10-07 22:03:57.250 2 DEBUG oslo_concurrency.processutils [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71,backing_fmt=raw /var/lib/nova/instances/4901d3e6-68ac-4c50-9462-ba7192c80bf4/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:03:57 compute-0 nova_compute[192716]: 2025-10-07 22:03:57.291 2 DEBUG oslo_concurrency.processutils [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71,backing_fmt=raw /var/lib/nova/instances/4901d3e6-68ac-4c50-9462-ba7192c80bf4/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:03:57 compute-0 nova_compute[192716]: 2025-10-07 22:03:57.292 2 DEBUG oslo_concurrency.lockutils [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.114s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:03:57 compute-0 nova_compute[192716]: 2025-10-07 22:03:57.293 2 DEBUG oslo_concurrency.processutils [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:03:57 compute-0 nova_compute[192716]: 2025-10-07 22:03:57.356 2 DEBUG oslo_concurrency.processutils [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:03:57 compute-0 nova_compute[192716]: 2025-10-07 22:03:57.357 2 DEBUG nova.virt.disk.api [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Checking if we can resize image /var/lib/nova/instances/4901d3e6-68ac-4c50-9462-ba7192c80bf4/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 07 22:03:57 compute-0 nova_compute[192716]: 2025-10-07 22:03:57.358 2 DEBUG oslo_concurrency.processutils [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4901d3e6-68ac-4c50-9462-ba7192c80bf4/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:03:57 compute-0 nova_compute[192716]: 2025-10-07 22:03:57.415 2 DEBUG oslo_concurrency.processutils [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4901d3e6-68ac-4c50-9462-ba7192c80bf4/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:03:57 compute-0 nova_compute[192716]: 2025-10-07 22:03:57.416 2 DEBUG nova.virt.disk.api [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Cannot resize image /var/lib/nova/instances/4901d3e6-68ac-4c50-9462-ba7192c80bf4/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 07 22:03:57 compute-0 nova_compute[192716]: 2025-10-07 22:03:57.416 2 DEBUG nova.objects.instance [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lazy-loading 'migration_context' on Instance uuid 4901d3e6-68ac-4c50-9462-ba7192c80bf4 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 22:03:57 compute-0 nova_compute[192716]: 2025-10-07 22:03:57.924 2 DEBUG nova.objects.base [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Object Instance<4901d3e6-68ac-4c50-9462-ba7192c80bf4> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Oct 07 22:03:57 compute-0 nova_compute[192716]: 2025-10-07 22:03:57.925 2 DEBUG oslo_concurrency.processutils [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/4901d3e6-68ac-4c50-9462-ba7192c80bf4/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:03:57 compute-0 nova_compute[192716]: 2025-10-07 22:03:57.953 2 DEBUG oslo_concurrency.processutils [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/4901d3e6-68ac-4c50-9462-ba7192c80bf4/disk.config 497664" returned: 0 in 0.028s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:03:57 compute-0 nova_compute[192716]: 2025-10-07 22:03:57.954 2 DEBUG nova.virt.libvirt.driver [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 4901d3e6-68ac-4c50-9462-ba7192c80bf4] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Oct 07 22:03:57 compute-0 nova_compute[192716]: 2025-10-07 22:03:57.956 2 DEBUG nova.virt.libvirt.vif [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-10-07T22:02:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1943482815',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1943482815',id=14,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-07T22:03:04Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3ad27d63f39845acba6b21828806b82a',ramdisk_id='',reservation_id='r-5b04ewyi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-152687663',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-152687663-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T22:03:04Z,user_data=None,user_id='db99335261504aa7b84c7d30ec17d679',uuid=4901d3e6-68ac-4c50-9462-ba7192c80bf4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c2c5a73f-18c6-4300-86ed-9f2441645bb3", "address": "fa:16:3e:bb:8f:4b", "network": {"id": "726154fe-bda6-431d-b983-7caa973a9e17", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-468758744-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "df550d234d364e7fb20f9ac88392be8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapc2c5a73f-18", "ovs_interfaceid": "c2c5a73f-18c6-4300-86ed-9f2441645bb3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 07 22:03:57 compute-0 nova_compute[192716]: 2025-10-07 22:03:57.956 2 DEBUG nova.network.os_vif_util [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Converting VIF {"id": "c2c5a73f-18c6-4300-86ed-9f2441645bb3", "address": "fa:16:3e:bb:8f:4b", "network": {"id": "726154fe-bda6-431d-b983-7caa973a9e17", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-468758744-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "df550d234d364e7fb20f9ac88392be8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapc2c5a73f-18", "ovs_interfaceid": "c2c5a73f-18c6-4300-86ed-9f2441645bb3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 22:03:57 compute-0 nova_compute[192716]: 2025-10-07 22:03:57.958 2 DEBUG nova.network.os_vif_util [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bb:8f:4b,bridge_name='br-int',has_traffic_filtering=True,id=c2c5a73f-18c6-4300-86ed-9f2441645bb3,network=Network(726154fe-bda6-431d-b983-7caa973a9e17),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc2c5a73f-18') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 22:03:57 compute-0 nova_compute[192716]: 2025-10-07 22:03:57.958 2 DEBUG os_vif [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bb:8f:4b,bridge_name='br-int',has_traffic_filtering=True,id=c2c5a73f-18c6-4300-86ed-9f2441645bb3,network=Network(726154fe-bda6-431d-b983-7caa973a9e17),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc2c5a73f-18') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 07 22:03:57 compute-0 nova_compute[192716]: 2025-10-07 22:03:57.959 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:03:57 compute-0 nova_compute[192716]: 2025-10-07 22:03:57.960 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:03:57 compute-0 nova_compute[192716]: 2025-10-07 22:03:57.960 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:03:57 compute-0 nova_compute[192716]: 2025-10-07 22:03:57.961 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:03:57 compute-0 nova_compute[192716]: 2025-10-07 22:03:57.962 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'f45078f4-6713-56f4-9019-ad7b170f44d9', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:03:57 compute-0 nova_compute[192716]: 2025-10-07 22:03:57.963 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:03:57 compute-0 nova_compute[192716]: 2025-10-07 22:03:57.965 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:03:57 compute-0 nova_compute[192716]: 2025-10-07 22:03:57.967 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:03:57 compute-0 nova_compute[192716]: 2025-10-07 22:03:57.968 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc2c5a73f-18, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:03:57 compute-0 nova_compute[192716]: 2025-10-07 22:03:57.968 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapc2c5a73f-18, col_values=(('qos', UUID('8dc1aa47-defb-4363-bddc-4919468d87d0')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:03:57 compute-0 nova_compute[192716]: 2025-10-07 22:03:57.969 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapc2c5a73f-18, col_values=(('external_ids', {'iface-id': 'c2c5a73f-18c6-4300-86ed-9f2441645bb3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:bb:8f:4b', 'vm-uuid': '4901d3e6-68ac-4c50-9462-ba7192c80bf4'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:03:57 compute-0 nova_compute[192716]: 2025-10-07 22:03:57.970 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:03:57 compute-0 NetworkManager[51722]: <info>  [1759874637.9717] manager: (tapc2c5a73f-18): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/54)
Oct 07 22:03:57 compute-0 nova_compute[192716]: 2025-10-07 22:03:57.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 07 22:03:57 compute-0 nova_compute[192716]: 2025-10-07 22:03:57.981 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:03:57 compute-0 nova_compute[192716]: 2025-10-07 22:03:57.981 2 INFO os_vif [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bb:8f:4b,bridge_name='br-int',has_traffic_filtering=True,id=c2c5a73f-18c6-4300-86ed-9f2441645bb3,network=Network(726154fe-bda6-431d-b983-7caa973a9e17),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc2c5a73f-18')
Oct 07 22:03:57 compute-0 nova_compute[192716]: 2025-10-07 22:03:57.982 2 DEBUG nova.virt.libvirt.driver [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Oct 07 22:03:57 compute-0 nova_compute[192716]: 2025-10-07 22:03:57.982 2 DEBUG nova.compute.manager [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpzkfxs280',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='4901d3e6-68ac-4c50-9462-ba7192c80bf4',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Oct 07 22:03:57 compute-0 nova_compute[192716]: 2025-10-07 22:03:57.983 2 WARNING neutronclient.v2_0.client [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:03:58 compute-0 nova_compute[192716]: 2025-10-07 22:03:58.049 2 WARNING neutronclient.v2_0.client [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:03:58 compute-0 nova_compute[192716]: 2025-10-07 22:03:58.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:03:58 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:03:58.786 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:e1:cc', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '62:84:ba:a4:1b:31'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:03:58 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:03:58.786 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 07 22:03:58 compute-0 podman[220925]: 2025-10-07 22:03:58.855075283 +0000 UTC m=+0.082210820 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct 07 22:03:58 compute-0 podman[220926]: 2025-10-07 22:03:58.893537595 +0000 UTC m=+0.117028946 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct 07 22:03:59 compute-0 nova_compute[192716]: 2025-10-07 22:03:59.153 2 DEBUG nova.network.neutron [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 4901d3e6-68ac-4c50-9462-ba7192c80bf4] Port c2c5a73f-18c6-4300-86ed-9f2441645bb3 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Oct 07 22:03:59 compute-0 nova_compute[192716]: 2025-10-07 22:03:59.170 2 DEBUG nova.compute.manager [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpzkfxs280',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='4901d3e6-68ac-4c50-9462-ba7192c80bf4',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Oct 07 22:03:59 compute-0 podman[203153]: time="2025-10-07T22:03:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:03:59 compute-0 nova_compute[192716]: 2025-10-07 22:03:59.787 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:03:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:03:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20751 "" "Go-http-client/1.1"
Oct 07 22:03:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:03:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3481 "" "Go-http-client/1.1"
Oct 07 22:04:00 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:04:00.787 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=dca786dc-b408-4181-8e47-0e14c60f13da, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:04:01 compute-0 openstack_network_exporter[205305]: ERROR   22:04:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:04:01 compute-0 openstack_network_exporter[205305]: ERROR   22:04:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:04:01 compute-0 openstack_network_exporter[205305]: ERROR   22:04:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:04:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:04:01 compute-0 openstack_network_exporter[205305]: ERROR   22:04:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:04:01 compute-0 openstack_network_exporter[205305]: ERROR   22:04:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:04:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:04:01 compute-0 podman[220964]: 2025-10-07 22:04:01.830667616 +0000 UTC m=+0.071710086 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 07 22:04:02 compute-0 kernel: tapc2c5a73f-18: entered promiscuous mode
Oct 07 22:04:02 compute-0 NetworkManager[51722]: <info>  [1759874642.1750] manager: (tapc2c5a73f-18): new Tun device (/org/freedesktop/NetworkManager/Devices/55)
Oct 07 22:04:02 compute-0 nova_compute[192716]: 2025-10-07 22:04:02.177 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:04:02 compute-0 ovn_controller[94904]: 2025-10-07T22:04:02Z|00141|binding|INFO|Claiming lport c2c5a73f-18c6-4300-86ed-9f2441645bb3 for this additional chassis.
Oct 07 22:04:02 compute-0 ovn_controller[94904]: 2025-10-07T22:04:02Z|00142|binding|INFO|c2c5a73f-18c6-4300-86ed-9f2441645bb3: Claiming fa:16:3e:bb:8f:4b 10.100.0.9
Oct 07 22:04:02 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:04:02.184 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bb:8f:4b 10.100.0.9'], port_security=['fa:16:3e:bb:8f:4b 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '4901d3e6-68ac-4c50-9462-ba7192c80bf4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-726154fe-bda6-431d-b983-7caa973a9e17', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ad27d63f39845acba6b21828806b82a', 'neutron:revision_number': '10', 'neutron:security_group_ids': '7085a98e-3cea-46c4-a04e-730e3c566bcd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4057f821-2b26-4a21-8644-5757b0f352fc, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=c2c5a73f-18c6-4300-86ed-9f2441645bb3) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:04:02 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:04:02.184 103791 INFO neutron.agent.ovn.metadata.agent [-] Port c2c5a73f-18c6-4300-86ed-9f2441645bb3 in datapath 726154fe-bda6-431d-b983-7caa973a9e17 unbound from our chassis
Oct 07 22:04:02 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:04:02.185 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 726154fe-bda6-431d-b983-7caa973a9e17
Oct 07 22:04:02 compute-0 nova_compute[192716]: 2025-10-07 22:04:02.195 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:04:02 compute-0 ovn_controller[94904]: 2025-10-07T22:04:02Z|00143|binding|INFO|Setting lport c2c5a73f-18c6-4300-86ed-9f2441645bb3 ovn-installed in OVS
Oct 07 22:04:02 compute-0 nova_compute[192716]: 2025-10-07 22:04:02.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:04:02 compute-0 nova_compute[192716]: 2025-10-07 22:04:02.199 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:04:02 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:04:02.199 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[28f13e63-2639-4553-8c84-dc8eb5e0d7e6]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:04:02 compute-0 systemd-udevd[221002]: Network interface NamePolicy= disabled on kernel command line.
Oct 07 22:04:02 compute-0 NetworkManager[51722]: <info>  [1759874642.2258] device (tapc2c5a73f-18): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 07 22:04:02 compute-0 NetworkManager[51722]: <info>  [1759874642.2269] device (tapc2c5a73f-18): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 07 22:04:02 compute-0 systemd-machined[152719]: New machine qemu-11-instance-0000000e.
Oct 07 22:04:02 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:04:02.229 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[21452fae-f7c9-4c28-b1a6-993290a90b19]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:04:02 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:04:02.232 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[1d892c34-df45-44aa-a5ee-484630a1902f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:04:02 compute-0 systemd[1]: Started Virtual Machine qemu-11-instance-0000000e.
Oct 07 22:04:02 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:04:02.269 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[53d8b079-2963-4494-a79a-ab4137a90f95]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:04:02 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:04:02.290 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[fa4b1eba-38c2-4ae0-8711-a167303c83df]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap726154fe-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:af:b2:2e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 433810, 'reachable_time': 38236, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221011, 'error': None, 'target': 'ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:04:02 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:04:02.310 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[9e9148cf-e48a-444c-9ce9-8c967070e0b3]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap726154fe-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 433824, 'tstamp': 433824}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221016, 'error': None, 'target': 'ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap726154fe-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 433828, 'tstamp': 433828}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221016, 'error': None, 'target': 'ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:04:02 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:04:02.312 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap726154fe-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:04:02 compute-0 nova_compute[192716]: 2025-10-07 22:04:02.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:04:02 compute-0 nova_compute[192716]: 2025-10-07 22:04:02.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:04:02 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:04:02.315 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap726154fe-b0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:04:02 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:04:02.315 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:04:02 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:04:02.315 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap726154fe-b0, col_values=(('external_ids', {'iface-id': 'b6dfddd4-019f-4508-ab9b-37759605366f'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:04:02 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:04:02.316 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:04:02 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:04:02.317 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[e5ff9f8b-0682-4a71-bc45-dca30e35c1c6]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-726154fe-bda6-431d-b983-7caa973a9e17\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/726154fe-bda6-431d-b983-7caa973a9e17.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 726154fe-bda6-431d-b983-7caa973a9e17\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:04:02 compute-0 nova_compute[192716]: 2025-10-07 22:04:02.971 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:04:04 compute-0 nova_compute[192716]: 2025-10-07 22:04:04.789 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:04:04 compute-0 ovn_controller[94904]: 2025-10-07T22:04:04Z|00144|binding|INFO|Claiming lport c2c5a73f-18c6-4300-86ed-9f2441645bb3 for this chassis.
Oct 07 22:04:04 compute-0 ovn_controller[94904]: 2025-10-07T22:04:04Z|00145|binding|INFO|c2c5a73f-18c6-4300-86ed-9f2441645bb3: Claiming fa:16:3e:bb:8f:4b 10.100.0.9
Oct 07 22:04:04 compute-0 ovn_controller[94904]: 2025-10-07T22:04:04Z|00146|binding|INFO|Setting lport c2c5a73f-18c6-4300-86ed-9f2441645bb3 up in Southbound
Oct 07 22:04:06 compute-0 nova_compute[192716]: 2025-10-07 22:04:06.072 2 INFO nova.compute.manager [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 4901d3e6-68ac-4c50-9462-ba7192c80bf4] Post operation of migration started
Oct 07 22:04:06 compute-0 nova_compute[192716]: 2025-10-07 22:04:06.073 2 WARNING neutronclient.v2_0.client [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:04:06 compute-0 nova_compute[192716]: 2025-10-07 22:04:06.173 2 WARNING neutronclient.v2_0.client [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:04:06 compute-0 nova_compute[192716]: 2025-10-07 22:04:06.173 2 WARNING neutronclient.v2_0.client [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:04:06 compute-0 nova_compute[192716]: 2025-10-07 22:04:06.259 2 DEBUG oslo_concurrency.lockutils [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "refresh_cache-4901d3e6-68ac-4c50-9462-ba7192c80bf4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 07 22:04:06 compute-0 nova_compute[192716]: 2025-10-07 22:04:06.259 2 DEBUG oslo_concurrency.lockutils [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquired lock "refresh_cache-4901d3e6-68ac-4c50-9462-ba7192c80bf4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 07 22:04:06 compute-0 nova_compute[192716]: 2025-10-07 22:04:06.260 2 DEBUG nova.network.neutron [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 4901d3e6-68ac-4c50-9462-ba7192c80bf4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 07 22:04:06 compute-0 nova_compute[192716]: 2025-10-07 22:04:06.766 2 WARNING neutronclient.v2_0.client [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:04:07 compute-0 nova_compute[192716]: 2025-10-07 22:04:07.864 2 WARNING neutronclient.v2_0.client [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:04:07 compute-0 nova_compute[192716]: 2025-10-07 22:04:07.973 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:04:08 compute-0 nova_compute[192716]: 2025-10-07 22:04:08.059 2 DEBUG nova.network.neutron [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 4901d3e6-68ac-4c50-9462-ba7192c80bf4] Updating instance_info_cache with network_info: [{"id": "c2c5a73f-18c6-4300-86ed-9f2441645bb3", "address": "fa:16:3e:bb:8f:4b", "network": {"id": "726154fe-bda6-431d-b983-7caa973a9e17", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-468758744-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "df550d234d364e7fb20f9ac88392be8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2c5a73f-18", "ovs_interfaceid": "c2c5a73f-18c6-4300-86ed-9f2441645bb3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:04:08 compute-0 nova_compute[192716]: 2025-10-07 22:04:08.567 2 DEBUG oslo_concurrency.lockutils [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Releasing lock "refresh_cache-4901d3e6-68ac-4c50-9462-ba7192c80bf4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 07 22:04:08 compute-0 podman[221034]: 2025-10-07 22:04:08.877788125 +0000 UTC m=+0.108847580 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251007)
Oct 07 22:04:09 compute-0 nova_compute[192716]: 2025-10-07 22:04:09.087 2 DEBUG oslo_concurrency.lockutils [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:04:09 compute-0 nova_compute[192716]: 2025-10-07 22:04:09.088 2 DEBUG oslo_concurrency.lockutils [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:04:09 compute-0 nova_compute[192716]: 2025-10-07 22:04:09.088 2 DEBUG oslo_concurrency.lockutils [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:04:09 compute-0 nova_compute[192716]: 2025-10-07 22:04:09.093 2 INFO nova.virt.libvirt.driver [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 4901d3e6-68ac-4c50-9462-ba7192c80bf4] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Oct 07 22:04:09 compute-0 virtqemud[192532]: Domain id=11 name='instance-0000000e' uuid=4901d3e6-68ac-4c50-9462-ba7192c80bf4 is tainted: custom-monitor
Oct 07 22:04:09 compute-0 nova_compute[192716]: 2025-10-07 22:04:09.792 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:04:10 compute-0 nova_compute[192716]: 2025-10-07 22:04:10.102 2 INFO nova.virt.libvirt.driver [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 4901d3e6-68ac-4c50-9462-ba7192c80bf4] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Oct 07 22:04:11 compute-0 nova_compute[192716]: 2025-10-07 22:04:11.108 2 INFO nova.virt.libvirt.driver [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 4901d3e6-68ac-4c50-9462-ba7192c80bf4] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Oct 07 22:04:11 compute-0 nova_compute[192716]: 2025-10-07 22:04:11.113 2 DEBUG nova.compute.manager [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 4901d3e6-68ac-4c50-9462-ba7192c80bf4] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 07 22:04:11 compute-0 nova_compute[192716]: 2025-10-07 22:04:11.626 2 DEBUG nova.objects.instance [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 4901d3e6-68ac-4c50-9462-ba7192c80bf4] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Oct 07 22:04:11 compute-0 podman[221062]: 2025-10-07 22:04:11.8275104 +0000 UTC m=+0.060367247 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 07 22:04:12 compute-0 nova_compute[192716]: 2025-10-07 22:04:12.646 2 WARNING neutronclient.v2_0.client [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:04:12 compute-0 nova_compute[192716]: 2025-10-07 22:04:12.975 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:04:13 compute-0 nova_compute[192716]: 2025-10-07 22:04:13.017 2 WARNING neutronclient.v2_0.client [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:04:13 compute-0 nova_compute[192716]: 2025-10-07 22:04:13.018 2 WARNING neutronclient.v2_0.client [None req-7d866f40-5373-4227-adb7-be1a9d54584b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:04:14 compute-0 nova_compute[192716]: 2025-10-07 22:04:14.793 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:04:14 compute-0 podman[221081]: 2025-10-07 22:04:14.81683454 +0000 UTC m=+0.064399144 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vcs-type=git, io.buildah.version=1.33.7, release=1755695350, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, 
config_id=edpm, distribution-scope=public, name=ubi9-minimal, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=)
Oct 07 22:04:17 compute-0 nova_compute[192716]: 2025-10-07 22:04:17.193 2 DEBUG oslo_concurrency.lockutils [None req-6c91c7f0-e859-44ff-a70e-45683f392866 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Acquiring lock "d237b0bb-037e-4864-9f2b-3cd5343c9b1a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:04:17 compute-0 nova_compute[192716]: 2025-10-07 22:04:17.194 2 DEBUG oslo_concurrency.lockutils [None req-6c91c7f0-e859-44ff-a70e-45683f392866 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Lock "d237b0bb-037e-4864-9f2b-3cd5343c9b1a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:04:17 compute-0 nova_compute[192716]: 2025-10-07 22:04:17.194 2 DEBUG oslo_concurrency.lockutils [None req-6c91c7f0-e859-44ff-a70e-45683f392866 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Acquiring lock "d237b0bb-037e-4864-9f2b-3cd5343c9b1a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:04:17 compute-0 nova_compute[192716]: 2025-10-07 22:04:17.194 2 DEBUG oslo_concurrency.lockutils [None req-6c91c7f0-e859-44ff-a70e-45683f392866 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Lock "d237b0bb-037e-4864-9f2b-3cd5343c9b1a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:04:17 compute-0 nova_compute[192716]: 2025-10-07 22:04:17.195 2 DEBUG oslo_concurrency.lockutils [None req-6c91c7f0-e859-44ff-a70e-45683f392866 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Lock "d237b0bb-037e-4864-9f2b-3cd5343c9b1a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:04:17 compute-0 nova_compute[192716]: 2025-10-07 22:04:17.213 2 INFO nova.compute.manager [None req-6c91c7f0-e859-44ff-a70e-45683f392866 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Terminating instance
Oct 07 22:04:17 compute-0 nova_compute[192716]: 2025-10-07 22:04:17.728 2 DEBUG nova.compute.manager [None req-6c91c7f0-e859-44ff-a70e-45683f392866 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 07 22:04:17 compute-0 kernel: tap3fa2f6fa-52 (unregistering): left promiscuous mode
Oct 07 22:04:17 compute-0 NetworkManager[51722]: <info>  [1759874657.7564] device (tap3fa2f6fa-52): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 07 22:04:17 compute-0 ovn_controller[94904]: 2025-10-07T22:04:17Z|00147|binding|INFO|Releasing lport 3fa2f6fa-5235-40b1-95d4-a5750a801212 from this chassis (sb_readonly=0)
Oct 07 22:04:17 compute-0 ovn_controller[94904]: 2025-10-07T22:04:17Z|00148|binding|INFO|Setting lport 3fa2f6fa-5235-40b1-95d4-a5750a801212 down in Southbound
Oct 07 22:04:17 compute-0 nova_compute[192716]: 2025-10-07 22:04:17.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:04:17 compute-0 ovn_controller[94904]: 2025-10-07T22:04:17Z|00149|binding|INFO|Removing iface tap3fa2f6fa-52 ovn-installed in OVS
Oct 07 22:04:17 compute-0 nova_compute[192716]: 2025-10-07 22:04:17.772 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:04:17 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:04:17.780 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7a:7b:28 10.100.0.10'], port_security=['fa:16:3e:7a:7b:28 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'd237b0bb-037e-4864-9f2b-3cd5343c9b1a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-726154fe-bda6-431d-b983-7caa973a9e17', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ad27d63f39845acba6b21828806b82a', 'neutron:revision_number': '5', 'neutron:security_group_ids': '7085a98e-3cea-46c4-a04e-730e3c566bcd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4057f821-2b26-4a21-8644-5757b0f352fc, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], logical_port=3fa2f6fa-5235-40b1-95d4-a5750a801212) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f071072e510>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:04:17 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:04:17.781 103791 INFO neutron.agent.ovn.metadata.agent [-] Port 3fa2f6fa-5235-40b1-95d4-a5750a801212 in datapath 726154fe-bda6-431d-b983-7caa973a9e17 unbound from our chassis
Oct 07 22:04:17 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:04:17.783 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 726154fe-bda6-431d-b983-7caa973a9e17
Oct 07 22:04:17 compute-0 nova_compute[192716]: 2025-10-07 22:04:17.802 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:04:17 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:04:17.813 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[a1337c5a-3bcf-4e48-a263-41ea38d5463d]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:04:17 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000f.scope: Deactivated successfully.
Oct 07 22:04:17 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000f.scope: Consumed 14.173s CPU time.
Oct 07 22:04:17 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:04:17.862 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[2967c0b9-deaf-4f2f-9aaf-2dda73e685b3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:04:17 compute-0 systemd-machined[152719]: Machine qemu-10-instance-0000000f terminated.
Oct 07 22:04:17 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:04:17.867 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[76493179-d85c-422d-8704-f624f59892c8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:04:17 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:04:17.910 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[10f65fe0-41c8-458d-a7e5-cfc2dae1fda0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:04:17 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:04:17.938 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[371d13a1-c5d8-4565-bd10-a453b63d82ea]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap726154fe-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:af:b2:2e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 433810, 'reachable_time': 40915, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221115, 'error': None, 'target': 'ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:04:17 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:04:17.968 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[d6cae9f0-9b14-43c2-b9b0-fc0f053dbad4]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap726154fe-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 433824, 'tstamp': 433824}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221117, 'error': None, 'target': 'ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap726154fe-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 433828, 'tstamp': 433828}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221117, 'error': None, 'target': 'ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:04:17 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:04:17.970 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap726154fe-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:04:17 compute-0 nova_compute[192716]: 2025-10-07 22:04:17.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:04:17 compute-0 nova_compute[192716]: 2025-10-07 22:04:17.978 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:04:17 compute-0 nova_compute[192716]: 2025-10-07 22:04:17.987 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:04:17 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:04:17.988 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap726154fe-b0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:04:17 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:04:17.989 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:04:17 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:04:17.989 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap726154fe-b0, col_values=(('external_ids', {'iface-id': 'b6dfddd4-019f-4508-ab9b-37759605366f'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:04:17 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:04:17.990 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:04:17 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:04:17.992 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[e472e8db-b9af-4187-bcb6-82abaecbaa06]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-726154fe-bda6-431d-b983-7caa973a9e17\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/726154fe-bda6-431d-b983-7caa973a9e17.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 726154fe-bda6-431d-b983-7caa973a9e17\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:04:18 compute-0 nova_compute[192716]: 2025-10-07 22:04:18.029 2 DEBUG nova.compute.manager [req-ec43790a-5570-4d53-9ec2-9ba5c6a67969 req-2820b4bc-8bc4-40b8-8b96-3f7f74526776 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Received event network-vif-unplugged-3fa2f6fa-5235-40b1-95d4-a5750a801212 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:04:18 compute-0 nova_compute[192716]: 2025-10-07 22:04:18.029 2 DEBUG oslo_concurrency.lockutils [req-ec43790a-5570-4d53-9ec2-9ba5c6a67969 req-2820b4bc-8bc4-40b8-8b96-3f7f74526776 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "d237b0bb-037e-4864-9f2b-3cd5343c9b1a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:04:18 compute-0 nova_compute[192716]: 2025-10-07 22:04:18.029 2 DEBUG oslo_concurrency.lockutils [req-ec43790a-5570-4d53-9ec2-9ba5c6a67969 req-2820b4bc-8bc4-40b8-8b96-3f7f74526776 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "d237b0bb-037e-4864-9f2b-3cd5343c9b1a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:04:18 compute-0 nova_compute[192716]: 2025-10-07 22:04:18.030 2 DEBUG oslo_concurrency.lockutils [req-ec43790a-5570-4d53-9ec2-9ba5c6a67969 req-2820b4bc-8bc4-40b8-8b96-3f7f74526776 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "d237b0bb-037e-4864-9f2b-3cd5343c9b1a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:04:18 compute-0 nova_compute[192716]: 2025-10-07 22:04:18.030 2 DEBUG nova.compute.manager [req-ec43790a-5570-4d53-9ec2-9ba5c6a67969 req-2820b4bc-8bc4-40b8-8b96-3f7f74526776 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] No waiting events found dispatching network-vif-unplugged-3fa2f6fa-5235-40b1-95d4-a5750a801212 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 22:04:18 compute-0 nova_compute[192716]: 2025-10-07 22:04:18.030 2 DEBUG nova.compute.manager [req-ec43790a-5570-4d53-9ec2-9ba5c6a67969 req-2820b4bc-8bc4-40b8-8b96-3f7f74526776 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Received event network-vif-unplugged-3fa2f6fa-5235-40b1-95d4-a5750a801212 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 07 22:04:18 compute-0 nova_compute[192716]: 2025-10-07 22:04:18.032 2 INFO nova.virt.libvirt.driver [-] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Instance destroyed successfully.
Oct 07 22:04:18 compute-0 nova_compute[192716]: 2025-10-07 22:04:18.033 2 DEBUG nova.objects.instance [None req-6c91c7f0-e859-44ff-a70e-45683f392866 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Lazy-loading 'resources' on Instance uuid d237b0bb-037e-4864-9f2b-3cd5343c9b1a obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 22:04:18 compute-0 nova_compute[192716]: 2025-10-07 22:04:18.543 2 DEBUG nova.virt.libvirt.vif [None req-6c91c7f0-e859-44ff-a70e-45683f392866 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-10-07T22:03:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-156515144',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-156515144',id=15,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-07T22:03:26Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3ad27d63f39845acba6b21828806b82a',ramdisk_id='',reservation_id='r-vaeio20y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_mi
n_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-152687663',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-152687663-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T22:03:26Z,user_data=None,user_id='db99335261504aa7b84c7d30ec17d679',uuid=d237b0bb-037e-4864-9f2b-3cd5343c9b1a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3fa2f6fa-5235-40b1-95d4-a5750a801212", "address": "fa:16:3e:7a:7b:28", "network": {"id": "726154fe-bda6-431d-b983-7caa973a9e17", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-468758744-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "df550d234d364e7fb20f9ac88392be8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3fa2f6fa-52", "ovs_interfaceid": "3fa2f6fa-5235-40b1-95d4-a5750a801212", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 07 22:04:18 compute-0 nova_compute[192716]: 2025-10-07 22:04:18.543 2 DEBUG nova.network.os_vif_util [None req-6c91c7f0-e859-44ff-a70e-45683f392866 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Converting VIF {"id": "3fa2f6fa-5235-40b1-95d4-a5750a801212", "address": "fa:16:3e:7a:7b:28", "network": {"id": "726154fe-bda6-431d-b983-7caa973a9e17", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-468758744-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "df550d234d364e7fb20f9ac88392be8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3fa2f6fa-52", "ovs_interfaceid": "3fa2f6fa-5235-40b1-95d4-a5750a801212", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 22:04:18 compute-0 nova_compute[192716]: 2025-10-07 22:04:18.544 2 DEBUG nova.network.os_vif_util [None req-6c91c7f0-e859-44ff-a70e-45683f392866 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7a:7b:28,bridge_name='br-int',has_traffic_filtering=True,id=3fa2f6fa-5235-40b1-95d4-a5750a801212,network=Network(726154fe-bda6-431d-b983-7caa973a9e17),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3fa2f6fa-52') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 22:04:18 compute-0 nova_compute[192716]: 2025-10-07 22:04:18.545 2 DEBUG os_vif [None req-6c91c7f0-e859-44ff-a70e-45683f392866 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7a:7b:28,bridge_name='br-int',has_traffic_filtering=True,id=3fa2f6fa-5235-40b1-95d4-a5750a801212,network=Network(726154fe-bda6-431d-b983-7caa973a9e17),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3fa2f6fa-52') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 07 22:04:18 compute-0 nova_compute[192716]: 2025-10-07 22:04:18.547 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:04:18 compute-0 nova_compute[192716]: 2025-10-07 22:04:18.548 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3fa2f6fa-52, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:04:18 compute-0 nova_compute[192716]: 2025-10-07 22:04:18.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:04:18 compute-0 nova_compute[192716]: 2025-10-07 22:04:18.551 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 07 22:04:18 compute-0 nova_compute[192716]: 2025-10-07 22:04:18.553 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:04:18 compute-0 nova_compute[192716]: 2025-10-07 22:04:18.553 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=c859c2c9-5762-4262-b94a-b95045b21884) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:04:18 compute-0 nova_compute[192716]: 2025-10-07 22:04:18.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:04:18 compute-0 nova_compute[192716]: 2025-10-07 22:04:18.556 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:04:18 compute-0 nova_compute[192716]: 2025-10-07 22:04:18.558 2 INFO os_vif [None req-6c91c7f0-e859-44ff-a70e-45683f392866 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7a:7b:28,bridge_name='br-int',has_traffic_filtering=True,id=3fa2f6fa-5235-40b1-95d4-a5750a801212,network=Network(726154fe-bda6-431d-b983-7caa973a9e17),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3fa2f6fa-52')
Oct 07 22:04:18 compute-0 nova_compute[192716]: 2025-10-07 22:04:18.559 2 INFO nova.virt.libvirt.driver [None req-6c91c7f0-e859-44ff-a70e-45683f392866 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Deleting instance files /var/lib/nova/instances/d237b0bb-037e-4864-9f2b-3cd5343c9b1a_del
Oct 07 22:04:18 compute-0 nova_compute[192716]: 2025-10-07 22:04:18.560 2 INFO nova.virt.libvirt.driver [None req-6c91c7f0-e859-44ff-a70e-45683f392866 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Deletion of /var/lib/nova/instances/d237b0bb-037e-4864-9f2b-3cd5343c9b1a_del complete
Oct 07 22:04:19 compute-0 nova_compute[192716]: 2025-10-07 22:04:19.075 2 INFO nova.compute.manager [None req-6c91c7f0-e859-44ff-a70e-45683f392866 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Took 1.35 seconds to destroy the instance on the hypervisor.
Oct 07 22:04:19 compute-0 nova_compute[192716]: 2025-10-07 22:04:19.076 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-6c91c7f0-e859-44ff-a70e-45683f392866 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 07 22:04:19 compute-0 nova_compute[192716]: 2025-10-07 22:04:19.076 2 DEBUG nova.compute.manager [-] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 07 22:04:19 compute-0 nova_compute[192716]: 2025-10-07 22:04:19.076 2 DEBUG nova.network.neutron [-] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 07 22:04:19 compute-0 nova_compute[192716]: 2025-10-07 22:04:19.077 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:04:19 compute-0 nova_compute[192716]: 2025-10-07 22:04:19.847 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:04:19 compute-0 nova_compute[192716]: 2025-10-07 22:04:19.998 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:04:20 compute-0 nova_compute[192716]: 2025-10-07 22:04:20.120 2 DEBUG nova.compute.manager [req-e8063f4c-f159-4f20-92c9-219c39651b71 req-7679da3e-e48b-4eb2-8a6c-d2ec758954f8 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Received event network-vif-unplugged-3fa2f6fa-5235-40b1-95d4-a5750a801212 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:04:20 compute-0 nova_compute[192716]: 2025-10-07 22:04:20.121 2 DEBUG oslo_concurrency.lockutils [req-e8063f4c-f159-4f20-92c9-219c39651b71 req-7679da3e-e48b-4eb2-8a6c-d2ec758954f8 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "d237b0bb-037e-4864-9f2b-3cd5343c9b1a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:04:20 compute-0 nova_compute[192716]: 2025-10-07 22:04:20.122 2 DEBUG oslo_concurrency.lockutils [req-e8063f4c-f159-4f20-92c9-219c39651b71 req-7679da3e-e48b-4eb2-8a6c-d2ec758954f8 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "d237b0bb-037e-4864-9f2b-3cd5343c9b1a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:04:20 compute-0 nova_compute[192716]: 2025-10-07 22:04:20.122 2 DEBUG oslo_concurrency.lockutils [req-e8063f4c-f159-4f20-92c9-219c39651b71 req-7679da3e-e48b-4eb2-8a6c-d2ec758954f8 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "d237b0bb-037e-4864-9f2b-3cd5343c9b1a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:04:20 compute-0 nova_compute[192716]: 2025-10-07 22:04:20.123 2 DEBUG nova.compute.manager [req-e8063f4c-f159-4f20-92c9-219c39651b71 req-7679da3e-e48b-4eb2-8a6c-d2ec758954f8 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] No waiting events found dispatching network-vif-unplugged-3fa2f6fa-5235-40b1-95d4-a5750a801212 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 22:04:20 compute-0 nova_compute[192716]: 2025-10-07 22:04:20.123 2 DEBUG nova.compute.manager [req-e8063f4c-f159-4f20-92c9-219c39651b71 req-7679da3e-e48b-4eb2-8a6c-d2ec758954f8 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Received event network-vif-unplugged-3fa2f6fa-5235-40b1-95d4-a5750a801212 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 07 22:04:20 compute-0 nova_compute[192716]: 2025-10-07 22:04:20.475 2 DEBUG nova.compute.manager [req-c5588cdc-85c8-43ce-871f-814f94b99bbf req-c5cc7f60-e00e-43d7-b22d-00397190f325 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Received event network-vif-deleted-3fa2f6fa-5235-40b1-95d4-a5750a801212 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:04:20 compute-0 nova_compute[192716]: 2025-10-07 22:04:20.476 2 INFO nova.compute.manager [req-c5588cdc-85c8-43ce-871f-814f94b99bbf req-c5cc7f60-e00e-43d7-b22d-00397190f325 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Neutron deleted interface 3fa2f6fa-5235-40b1-95d4-a5750a801212; detaching it from the instance and deleting it from the info cache
Oct 07 22:04:20 compute-0 nova_compute[192716]: 2025-10-07 22:04:20.476 2 DEBUG nova.network.neutron [req-c5588cdc-85c8-43ce-871f-814f94b99bbf req-c5cc7f60-e00e-43d7-b22d-00397190f325 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:04:20 compute-0 nova_compute[192716]: 2025-10-07 22:04:20.919 2 DEBUG nova.network.neutron [-] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:04:20 compute-0 nova_compute[192716]: 2025-10-07 22:04:20.984 2 DEBUG nova.compute.manager [req-c5588cdc-85c8-43ce-871f-814f94b99bbf req-c5cc7f60-e00e-43d7-b22d-00397190f325 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Detach interface failed, port_id=3fa2f6fa-5235-40b1-95d4-a5750a801212, reason: Instance d237b0bb-037e-4864-9f2b-3cd5343c9b1a could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 07 22:04:21 compute-0 nova_compute[192716]: 2025-10-07 22:04:21.427 2 INFO nova.compute.manager [-] [instance: d237b0bb-037e-4864-9f2b-3cd5343c9b1a] Took 2.35 seconds to deallocate network for instance.
Oct 07 22:04:21 compute-0 nova_compute[192716]: 2025-10-07 22:04:21.958 2 DEBUG oslo_concurrency.lockutils [None req-6c91c7f0-e859-44ff-a70e-45683f392866 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:04:21 compute-0 nova_compute[192716]: 2025-10-07 22:04:21.958 2 DEBUG oslo_concurrency.lockutils [None req-6c91c7f0-e859-44ff-a70e-45683f392866 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:04:22 compute-0 nova_compute[192716]: 2025-10-07 22:04:22.021 2 DEBUG nova.compute.provider_tree [None req-6c91c7f0-e859-44ff-a70e-45683f392866 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 22:04:23 compute-0 nova_compute[192716]: 2025-10-07 22:04:23.555 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:04:24 compute-0 nova_compute[192716]: 2025-10-07 22:04:24.813 2 DEBUG nova.scheduler.client.report [None req-6c91c7f0-e859-44ff-a70e-45683f392866 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 22:04:24 compute-0 nova_compute[192716]: 2025-10-07 22:04:24.854 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:04:25 compute-0 nova_compute[192716]: 2025-10-07 22:04:25.323 2 DEBUG oslo_concurrency.lockutils [None req-6c91c7f0-e859-44ff-a70e-45683f392866 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 3.365s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:04:25 compute-0 nova_compute[192716]: 2025-10-07 22:04:25.392 2 INFO nova.scheduler.client.report [None req-6c91c7f0-e859-44ff-a70e-45683f392866 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Deleted allocations for instance d237b0bb-037e-4864-9f2b-3cd5343c9b1a
Oct 07 22:04:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:04:25.630 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:04:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:04:25.630 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:04:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:04:25.631 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:04:26 compute-0 nova_compute[192716]: 2025-10-07 22:04:26.437 2 DEBUG oslo_concurrency.lockutils [None req-6c91c7f0-e859-44ff-a70e-45683f392866 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Lock "d237b0bb-037e-4864-9f2b-3cd5343c9b1a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 9.243s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:04:26 compute-0 nova_compute[192716]: 2025-10-07 22:04:26.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:04:27 compute-0 nova_compute[192716]: 2025-10-07 22:04:27.162 2 DEBUG oslo_concurrency.lockutils [None req-aa298a0f-cb84-402e-b210-91309fced6a9 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Acquiring lock "4901d3e6-68ac-4c50-9462-ba7192c80bf4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:04:27 compute-0 nova_compute[192716]: 2025-10-07 22:04:27.162 2 DEBUG oslo_concurrency.lockutils [None req-aa298a0f-cb84-402e-b210-91309fced6a9 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Lock "4901d3e6-68ac-4c50-9462-ba7192c80bf4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:04:27 compute-0 nova_compute[192716]: 2025-10-07 22:04:27.163 2 DEBUG oslo_concurrency.lockutils [None req-aa298a0f-cb84-402e-b210-91309fced6a9 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Acquiring lock "4901d3e6-68ac-4c50-9462-ba7192c80bf4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:04:27 compute-0 nova_compute[192716]: 2025-10-07 22:04:27.163 2 DEBUG oslo_concurrency.lockutils [None req-aa298a0f-cb84-402e-b210-91309fced6a9 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Lock "4901d3e6-68ac-4c50-9462-ba7192c80bf4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:04:27 compute-0 nova_compute[192716]: 2025-10-07 22:04:27.164 2 DEBUG oslo_concurrency.lockutils [None req-aa298a0f-cb84-402e-b210-91309fced6a9 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Lock "4901d3e6-68ac-4c50-9462-ba7192c80bf4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:04:27 compute-0 nova_compute[192716]: 2025-10-07 22:04:27.179 2 INFO nova.compute.manager [None req-aa298a0f-cb84-402e-b210-91309fced6a9 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: 4901d3e6-68ac-4c50-9462-ba7192c80bf4] Terminating instance
Oct 07 22:04:27 compute-0 nova_compute[192716]: 2025-10-07 22:04:27.699 2 DEBUG nova.compute.manager [None req-aa298a0f-cb84-402e-b210-91309fced6a9 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: 4901d3e6-68ac-4c50-9462-ba7192c80bf4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 07 22:04:27 compute-0 kernel: tapc2c5a73f-18 (unregistering): left promiscuous mode
Oct 07 22:04:27 compute-0 NetworkManager[51722]: <info>  [1759874667.7378] device (tapc2c5a73f-18): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 07 22:04:27 compute-0 nova_compute[192716]: 2025-10-07 22:04:27.749 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:04:27 compute-0 ovn_controller[94904]: 2025-10-07T22:04:27Z|00150|binding|INFO|Releasing lport c2c5a73f-18c6-4300-86ed-9f2441645bb3 from this chassis (sb_readonly=0)
Oct 07 22:04:27 compute-0 ovn_controller[94904]: 2025-10-07T22:04:27Z|00151|binding|INFO|Setting lport c2c5a73f-18c6-4300-86ed-9f2441645bb3 down in Southbound
Oct 07 22:04:27 compute-0 ovn_controller[94904]: 2025-10-07T22:04:27Z|00152|binding|INFO|Removing iface tapc2c5a73f-18 ovn-installed in OVS
Oct 07 22:04:27 compute-0 nova_compute[192716]: 2025-10-07 22:04:27.753 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:04:27 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:04:27.761 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bb:8f:4b 10.100.0.9'], port_security=['fa:16:3e:bb:8f:4b 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '4901d3e6-68ac-4c50-9462-ba7192c80bf4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-726154fe-bda6-431d-b983-7caa973a9e17', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ad27d63f39845acba6b21828806b82a', 'neutron:revision_number': '15', 'neutron:security_group_ids': '7085a98e-3cea-46c4-a04e-730e3c566bcd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4057f821-2b26-4a21-8644-5757b0f352fc, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], logical_port=c2c5a73f-18c6-4300-86ed-9f2441645bb3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f071072e510>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:04:27 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:04:27.762 103791 INFO neutron.agent.ovn.metadata.agent [-] Port c2c5a73f-18c6-4300-86ed-9f2441645bb3 in datapath 726154fe-bda6-431d-b983-7caa973a9e17 unbound from our chassis
Oct 07 22:04:27 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:04:27.764 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 726154fe-bda6-431d-b983-7caa973a9e17, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 07 22:04:27 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:04:27.765 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[fd39f7b9-0162-4221-89aa-18887b0ca5fb]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:04:27 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:04:27.765 103791 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17 namespace which is not needed anymore
Oct 07 22:04:27 compute-0 nova_compute[192716]: 2025-10-07 22:04:27.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:04:27 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000e.scope: Deactivated successfully.
Oct 07 22:04:27 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000e.scope: Consumed 2.368s CPU time.
Oct 07 22:04:27 compute-0 systemd-machined[152719]: Machine qemu-11-instance-0000000e terminated.
Oct 07 22:04:27 compute-0 nova_compute[192716]: 2025-10-07 22:04:27.926 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:04:27 compute-0 nova_compute[192716]: 2025-10-07 22:04:27.934 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:04:27 compute-0 neutron-haproxy-ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17[220735]: [NOTICE]   (220739) : haproxy version is 3.0.5-8e879a5
Oct 07 22:04:27 compute-0 neutron-haproxy-ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17[220735]: [NOTICE]   (220739) : path to executable is /usr/sbin/haproxy
Oct 07 22:04:27 compute-0 neutron-haproxy-ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17[220735]: [WARNING]  (220739) : Exiting Master process...
Oct 07 22:04:27 compute-0 podman[221161]: 2025-10-07 22:04:27.968970261 +0000 UTC m=+0.052529801 container kill d449cfdac63c80e0349038a77d1427e061819b163df7a20d97ebee0f9a3933ba (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Oct 07 22:04:27 compute-0 neutron-haproxy-ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17[220735]: [ALERT]    (220739) : Current worker (220741) exited with code 143 (Terminated)
Oct 07 22:04:27 compute-0 neutron-haproxy-ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17[220735]: [WARNING]  (220739) : All workers exited. Exiting... (0)
Oct 07 22:04:27 compute-0 systemd[1]: libpod-d449cfdac63c80e0349038a77d1427e061819b163df7a20d97ebee0f9a3933ba.scope: Deactivated successfully.
Oct 07 22:04:27 compute-0 nova_compute[192716]: 2025-10-07 22:04:27.976 2 INFO nova.virt.libvirt.driver [-] [instance: 4901d3e6-68ac-4c50-9462-ba7192c80bf4] Instance destroyed successfully.
Oct 07 22:04:27 compute-0 nova_compute[192716]: 2025-10-07 22:04:27.976 2 DEBUG nova.objects.instance [None req-aa298a0f-cb84-402e-b210-91309fced6a9 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Lazy-loading 'resources' on Instance uuid 4901d3e6-68ac-4c50-9462-ba7192c80bf4 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 22:04:27 compute-0 nova_compute[192716]: 2025-10-07 22:04:27.988 2 DEBUG nova.compute.manager [req-d6f2d00d-558b-4875-bd89-3953945c6156 req-7125b945-98a8-4588-86bc-9f00060c1e78 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 4901d3e6-68ac-4c50-9462-ba7192c80bf4] Received event network-vif-unplugged-c2c5a73f-18c6-4300-86ed-9f2441645bb3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:04:27 compute-0 nova_compute[192716]: 2025-10-07 22:04:27.989 2 DEBUG oslo_concurrency.lockutils [req-d6f2d00d-558b-4875-bd89-3953945c6156 req-7125b945-98a8-4588-86bc-9f00060c1e78 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "4901d3e6-68ac-4c50-9462-ba7192c80bf4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:04:27 compute-0 nova_compute[192716]: 2025-10-07 22:04:27.989 2 DEBUG oslo_concurrency.lockutils [req-d6f2d00d-558b-4875-bd89-3953945c6156 req-7125b945-98a8-4588-86bc-9f00060c1e78 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "4901d3e6-68ac-4c50-9462-ba7192c80bf4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:04:27 compute-0 nova_compute[192716]: 2025-10-07 22:04:27.989 2 DEBUG oslo_concurrency.lockutils [req-d6f2d00d-558b-4875-bd89-3953945c6156 req-7125b945-98a8-4588-86bc-9f00060c1e78 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "4901d3e6-68ac-4c50-9462-ba7192c80bf4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:04:27 compute-0 nova_compute[192716]: 2025-10-07 22:04:27.990 2 DEBUG nova.compute.manager [req-d6f2d00d-558b-4875-bd89-3953945c6156 req-7125b945-98a8-4588-86bc-9f00060c1e78 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 4901d3e6-68ac-4c50-9462-ba7192c80bf4] No waiting events found dispatching network-vif-unplugged-c2c5a73f-18c6-4300-86ed-9f2441645bb3 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 22:04:27 compute-0 nova_compute[192716]: 2025-10-07 22:04:27.990 2 DEBUG nova.compute.manager [req-d6f2d00d-558b-4875-bd89-3953945c6156 req-7125b945-98a8-4588-86bc-9f00060c1e78 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 4901d3e6-68ac-4c50-9462-ba7192c80bf4] Received event network-vif-unplugged-c2c5a73f-18c6-4300-86ed-9f2441645bb3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 07 22:04:28 compute-0 podman[221191]: 2025-10-07 22:04:28.029368668 +0000 UTC m=+0.033291434 container died d449cfdac63c80e0349038a77d1427e061819b163df7a20d97ebee0f9a3933ba (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Oct 07 22:04:28 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d449cfdac63c80e0349038a77d1427e061819b163df7a20d97ebee0f9a3933ba-userdata-shm.mount: Deactivated successfully.
Oct 07 22:04:28 compute-0 systemd[1]: var-lib-containers-storage-overlay-425525147de1bd309e2ffcbf4b6e919280d243c81ab8da590014e3a4513f993f-merged.mount: Deactivated successfully.
Oct 07 22:04:28 compute-0 podman[221191]: 2025-10-07 22:04:28.080723304 +0000 UTC m=+0.084646070 container cleanup d449cfdac63c80e0349038a77d1427e061819b163df7a20d97ebee0f9a3933ba (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, tcib_managed=true)
Oct 07 22:04:28 compute-0 systemd[1]: libpod-conmon-d449cfdac63c80e0349038a77d1427e061819b163df7a20d97ebee0f9a3933ba.scope: Deactivated successfully.
Oct 07 22:04:28 compute-0 podman[221193]: 2025-10-07 22:04:28.103266566 +0000 UTC m=+0.096778401 container remove d449cfdac63c80e0349038a77d1427e061819b163df7a20d97ebee0f9a3933ba (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251007, tcib_managed=true)
Oct 07 22:04:28 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:04:28.111 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[8ef70617-377a-4182-89d4-2b3c86f73a6a]: (4, ("Tue Oct  7 10:04:27 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17 (d449cfdac63c80e0349038a77d1427e061819b163df7a20d97ebee0f9a3933ba)\nd449cfdac63c80e0349038a77d1427e061819b163df7a20d97ebee0f9a3933ba\nTue Oct  7 10:04:27 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17 (d449cfdac63c80e0349038a77d1427e061819b163df7a20d97ebee0f9a3933ba)\nd449cfdac63c80e0349038a77d1427e061819b163df7a20d97ebee0f9a3933ba\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:04:28 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:04:28.112 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[86c5b877-83c1-49fd-b426-2e40f7feb2ae]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:04:28 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:04:28.113 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/726154fe-bda6-431d-b983-7caa973a9e17.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/726154fe-bda6-431d-b983-7caa973a9e17.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 22:04:28 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:04:28.113 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[631ea845-6146-41b7-98f2-13c4c0b4376c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:04:28 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:04:28.114 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap726154fe-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:04:28 compute-0 nova_compute[192716]: 2025-10-07 22:04:28.116 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:04:28 compute-0 kernel: tap726154fe-b0: left promiscuous mode
Oct 07 22:04:28 compute-0 nova_compute[192716]: 2025-10-07 22:04:28.145 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:04:28 compute-0 nova_compute[192716]: 2025-10-07 22:04:28.147 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:04:28 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:04:28.149 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[df4007ab-4dd8-4f02-aa16-5ed7b91a4313]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:04:28 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:04:28.185 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[e17ed2a8-ea07-48c0-a45c-80e5e1c37583]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:04:28 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:04:28.186 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[49d7e692-c862-422e-93de-b9c81ef7009b]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:04:28 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:04:28.206 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[18563404-0d5b-4ab0-822c-0cc3f604898c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 433802, 'reachable_time': 16477, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221225, 'error': None, 'target': 'ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:04:28 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:04:28.208 103905 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-726154fe-bda6-431d-b983-7caa973a9e17 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Oct 07 22:04:28 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:04:28.209 103905 DEBUG oslo.privsep.daemon [-] privsep: reply[272fb698-27be-4ef7-8b95-570f78e178fa]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:04:28 compute-0 systemd[1]: run-netns-ovnmeta\x2d726154fe\x2dbda6\x2d431d\x2db983\x2d7caa973a9e17.mount: Deactivated successfully.
Oct 07 22:04:28 compute-0 nova_compute[192716]: 2025-10-07 22:04:28.488 2 DEBUG nova.virt.libvirt.vif [None req-aa298a0f-cb84-402e-b210-91309fced6a9 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-10-07T22:02:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1943482815',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1943482815',id=14,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-07T22:03:04Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3ad27d63f39845acba6b21828806b82a',ramdisk_id='',reservation_id='r-5b04ewyi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',clean_attempts='1',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mod
el='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-152687663',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-152687663-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T22:04:12Z,user_data=None,user_id='db99335261504aa7b84c7d30ec17d679',uuid=4901d3e6-68ac-4c50-9462-ba7192c80bf4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c2c5a73f-18c6-4300-86ed-9f2441645bb3", "address": "fa:16:3e:bb:8f:4b", "network": {"id": "726154fe-bda6-431d-b983-7caa973a9e17", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-468758744-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "df550d234d364e7fb20f9ac88392be8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2c5a73f-18", "ovs_interfaceid": "c2c5a73f-18c6-4300-86ed-9f2441645bb3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 07 22:04:28 compute-0 nova_compute[192716]: 2025-10-07 22:04:28.489 2 DEBUG nova.network.os_vif_util [None req-aa298a0f-cb84-402e-b210-91309fced6a9 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Converting VIF {"id": "c2c5a73f-18c6-4300-86ed-9f2441645bb3", "address": "fa:16:3e:bb:8f:4b", "network": {"id": "726154fe-bda6-431d-b983-7caa973a9e17", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-468758744-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "df550d234d364e7fb20f9ac88392be8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2c5a73f-18", "ovs_interfaceid": "c2c5a73f-18c6-4300-86ed-9f2441645bb3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 22:04:28 compute-0 nova_compute[192716]: 2025-10-07 22:04:28.490 2 DEBUG nova.network.os_vif_util [None req-aa298a0f-cb84-402e-b210-91309fced6a9 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:bb:8f:4b,bridge_name='br-int',has_traffic_filtering=True,id=c2c5a73f-18c6-4300-86ed-9f2441645bb3,network=Network(726154fe-bda6-431d-b983-7caa973a9e17),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc2c5a73f-18') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 22:04:28 compute-0 nova_compute[192716]: 2025-10-07 22:04:28.491 2 DEBUG os_vif [None req-aa298a0f-cb84-402e-b210-91309fced6a9 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:bb:8f:4b,bridge_name='br-int',has_traffic_filtering=True,id=c2c5a73f-18c6-4300-86ed-9f2441645bb3,network=Network(726154fe-bda6-431d-b983-7caa973a9e17),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc2c5a73f-18') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 07 22:04:28 compute-0 nova_compute[192716]: 2025-10-07 22:04:28.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:04:28 compute-0 nova_compute[192716]: 2025-10-07 22:04:28.494 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc2c5a73f-18, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:04:28 compute-0 nova_compute[192716]: 2025-10-07 22:04:28.499 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 07 22:04:28 compute-0 nova_compute[192716]: 2025-10-07 22:04:28.500 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:04:28 compute-0 nova_compute[192716]: 2025-10-07 22:04:28.501 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=8dc1aa47-defb-4363-bddc-4919468d87d0) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:04:28 compute-0 nova_compute[192716]: 2025-10-07 22:04:28.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:04:28 compute-0 nova_compute[192716]: 2025-10-07 22:04:28.507 2 INFO os_vif [None req-aa298a0f-cb84-402e-b210-91309fced6a9 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:bb:8f:4b,bridge_name='br-int',has_traffic_filtering=True,id=c2c5a73f-18c6-4300-86ed-9f2441645bb3,network=Network(726154fe-bda6-431d-b983-7caa973a9e17),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc2c5a73f-18')
Oct 07 22:04:28 compute-0 nova_compute[192716]: 2025-10-07 22:04:28.507 2 INFO nova.virt.libvirt.driver [None req-aa298a0f-cb84-402e-b210-91309fced6a9 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: 4901d3e6-68ac-4c50-9462-ba7192c80bf4] Deleting instance files /var/lib/nova/instances/4901d3e6-68ac-4c50-9462-ba7192c80bf4_del
Oct 07 22:04:28 compute-0 nova_compute[192716]: 2025-10-07 22:04:28.508 2 INFO nova.virt.libvirt.driver [None req-aa298a0f-cb84-402e-b210-91309fced6a9 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: 4901d3e6-68ac-4c50-9462-ba7192c80bf4] Deletion of /var/lib/nova/instances/4901d3e6-68ac-4c50-9462-ba7192c80bf4_del complete
Oct 07 22:04:29 compute-0 nova_compute[192716]: 2025-10-07 22:04:29.023 2 INFO nova.compute.manager [None req-aa298a0f-cb84-402e-b210-91309fced6a9 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] [instance: 4901d3e6-68ac-4c50-9462-ba7192c80bf4] Took 1.32 seconds to destroy the instance on the hypervisor.
Oct 07 22:04:29 compute-0 nova_compute[192716]: 2025-10-07 22:04:29.023 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-aa298a0f-cb84-402e-b210-91309fced6a9 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 07 22:04:29 compute-0 nova_compute[192716]: 2025-10-07 22:04:29.024 2 DEBUG nova.compute.manager [-] [instance: 4901d3e6-68ac-4c50-9462-ba7192c80bf4] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 07 22:04:29 compute-0 nova_compute[192716]: 2025-10-07 22:04:29.024 2 DEBUG nova.network.neutron [-] [instance: 4901d3e6-68ac-4c50-9462-ba7192c80bf4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 07 22:04:29 compute-0 nova_compute[192716]: 2025-10-07 22:04:29.024 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:04:29 compute-0 podman[203153]: time="2025-10-07T22:04:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:04:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:04:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 22:04:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:04:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3020 "" "Go-http-client/1.1"
Oct 07 22:04:29 compute-0 podman[221226]: 2025-10-07 22:04:29.854837339 +0000 UTC m=+0.085928487 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Oct 07 22:04:29 compute-0 nova_compute[192716]: 2025-10-07 22:04:29.857 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:04:29 compute-0 podman[221227]: 2025-10-07 22:04:29.86940475 +0000 UTC m=+0.101214529 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 07 22:04:29 compute-0 nova_compute[192716]: 2025-10-07 22:04:29.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:04:29 compute-0 nova_compute[192716]: 2025-10-07 22:04:29.991 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 07 22:04:30 compute-0 nova_compute[192716]: 2025-10-07 22:04:30.009 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:04:30 compute-0 nova_compute[192716]: 2025-10-07 22:04:30.057 2 DEBUG nova.compute.manager [req-11350d9a-d022-471a-8071-e5a6df53a7a7 req-464d8181-6ca2-4c70-ab76-e8e0a5088e96 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 4901d3e6-68ac-4c50-9462-ba7192c80bf4] Received event network-vif-unplugged-c2c5a73f-18c6-4300-86ed-9f2441645bb3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:04:30 compute-0 nova_compute[192716]: 2025-10-07 22:04:30.058 2 DEBUG oslo_concurrency.lockutils [req-11350d9a-d022-471a-8071-e5a6df53a7a7 req-464d8181-6ca2-4c70-ab76-e8e0a5088e96 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "4901d3e6-68ac-4c50-9462-ba7192c80bf4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:04:30 compute-0 nova_compute[192716]: 2025-10-07 22:04:30.058 2 DEBUG oslo_concurrency.lockutils [req-11350d9a-d022-471a-8071-e5a6df53a7a7 req-464d8181-6ca2-4c70-ab76-e8e0a5088e96 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "4901d3e6-68ac-4c50-9462-ba7192c80bf4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:04:30 compute-0 nova_compute[192716]: 2025-10-07 22:04:30.059 2 DEBUG oslo_concurrency.lockutils [req-11350d9a-d022-471a-8071-e5a6df53a7a7 req-464d8181-6ca2-4c70-ab76-e8e0a5088e96 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "4901d3e6-68ac-4c50-9462-ba7192c80bf4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:04:30 compute-0 nova_compute[192716]: 2025-10-07 22:04:30.059 2 DEBUG nova.compute.manager [req-11350d9a-d022-471a-8071-e5a6df53a7a7 req-464d8181-6ca2-4c70-ab76-e8e0a5088e96 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 4901d3e6-68ac-4c50-9462-ba7192c80bf4] No waiting events found dispatching network-vif-unplugged-c2c5a73f-18c6-4300-86ed-9f2441645bb3 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 22:04:30 compute-0 nova_compute[192716]: 2025-10-07 22:04:30.059 2 DEBUG nova.compute.manager [req-11350d9a-d022-471a-8071-e5a6df53a7a7 req-464d8181-6ca2-4c70-ab76-e8e0a5088e96 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 4901d3e6-68ac-4c50-9462-ba7192c80bf4] Received event network-vif-unplugged-c2c5a73f-18c6-4300-86ed-9f2441645bb3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 07 22:04:31 compute-0 nova_compute[192716]: 2025-10-07 22:04:31.152 2 DEBUG nova.compute.manager [req-72363eea-8022-410f-882c-3222a044d128 req-48373bc1-9870-4b68-a10b-8de38a614acf 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 4901d3e6-68ac-4c50-9462-ba7192c80bf4] Received event network-vif-deleted-c2c5a73f-18c6-4300-86ed-9f2441645bb3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:04:31 compute-0 nova_compute[192716]: 2025-10-07 22:04:31.153 2 INFO nova.compute.manager [req-72363eea-8022-410f-882c-3222a044d128 req-48373bc1-9870-4b68-a10b-8de38a614acf 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 4901d3e6-68ac-4c50-9462-ba7192c80bf4] Neutron deleted interface c2c5a73f-18c6-4300-86ed-9f2441645bb3; detaching it from the instance and deleting it from the info cache
Oct 07 22:04:31 compute-0 nova_compute[192716]: 2025-10-07 22:04:31.154 2 DEBUG nova.network.neutron [req-72363eea-8022-410f-882c-3222a044d128 req-48373bc1-9870-4b68-a10b-8de38a614acf 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 4901d3e6-68ac-4c50-9462-ba7192c80bf4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:04:31 compute-0 openstack_network_exporter[205305]: ERROR   22:04:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:04:31 compute-0 openstack_network_exporter[205305]: ERROR   22:04:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:04:31 compute-0 openstack_network_exporter[205305]: ERROR   22:04:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:04:31 compute-0 openstack_network_exporter[205305]: ERROR   22:04:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:04:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:04:31 compute-0 openstack_network_exporter[205305]: ERROR   22:04:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:04:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:04:31 compute-0 nova_compute[192716]: 2025-10-07 22:04:31.562 2 DEBUG nova.network.neutron [-] [instance: 4901d3e6-68ac-4c50-9462-ba7192c80bf4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:04:31 compute-0 nova_compute[192716]: 2025-10-07 22:04:31.662 2 DEBUG nova.compute.manager [req-72363eea-8022-410f-882c-3222a044d128 req-48373bc1-9870-4b68-a10b-8de38a614acf 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 4901d3e6-68ac-4c50-9462-ba7192c80bf4] Detach interface failed, port_id=c2c5a73f-18c6-4300-86ed-9f2441645bb3, reason: Instance 4901d3e6-68ac-4c50-9462-ba7192c80bf4 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 07 22:04:32 compute-0 nova_compute[192716]: 2025-10-07 22:04:32.069 2 INFO nova.compute.manager [-] [instance: 4901d3e6-68ac-4c50-9462-ba7192c80bf4] Took 3.05 seconds to deallocate network for instance.
Oct 07 22:04:32 compute-0 nova_compute[192716]: 2025-10-07 22:04:32.615 2 DEBUG oslo_concurrency.lockutils [None req-aa298a0f-cb84-402e-b210-91309fced6a9 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:04:32 compute-0 nova_compute[192716]: 2025-10-07 22:04:32.616 2 DEBUG oslo_concurrency.lockutils [None req-aa298a0f-cb84-402e-b210-91309fced6a9 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:04:32 compute-0 nova_compute[192716]: 2025-10-07 22:04:32.638 2 DEBUG oslo_concurrency.lockutils [None req-aa298a0f-cb84-402e-b210-91309fced6a9 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.022s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:04:32 compute-0 nova_compute[192716]: 2025-10-07 22:04:32.684 2 INFO nova.scheduler.client.report [None req-aa298a0f-cb84-402e-b210-91309fced6a9 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Deleted allocations for instance 4901d3e6-68ac-4c50-9462-ba7192c80bf4
Oct 07 22:04:32 compute-0 podman[221267]: 2025-10-07 22:04:32.816458357 +0000 UTC m=+0.059445341 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 07 22:04:33 compute-0 nova_compute[192716]: 2025-10-07 22:04:33.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:04:33 compute-0 nova_compute[192716]: 2025-10-07 22:04:33.715 2 DEBUG oslo_concurrency.lockutils [None req-aa298a0f-cb84-402e-b210-91309fced6a9 db99335261504aa7b84c7d30ec17d679 3ad27d63f39845acba6b21828806b82a - - default default] Lock "4901d3e6-68ac-4c50-9462-ba7192c80bf4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.553s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:04:33 compute-0 nova_compute[192716]: 2025-10-07 22:04:33.986 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:04:33 compute-0 nova_compute[192716]: 2025-10-07 22:04:33.989 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:04:34 compute-0 nova_compute[192716]: 2025-10-07 22:04:34.857 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:04:35 compute-0 nova_compute[192716]: 2025-10-07 22:04:35.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:04:36 compute-0 nova_compute[192716]: 2025-10-07 22:04:36.503 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:04:36 compute-0 nova_compute[192716]: 2025-10-07 22:04:36.503 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:04:36 compute-0 nova_compute[192716]: 2025-10-07 22:04:36.503 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:04:36 compute-0 nova_compute[192716]: 2025-10-07 22:04:36.504 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 07 22:04:36 compute-0 nova_compute[192716]: 2025-10-07 22:04:36.726 2 WARNING nova.virt.libvirt.driver [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 22:04:36 compute-0 nova_compute[192716]: 2025-10-07 22:04:36.728 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:04:36 compute-0 nova_compute[192716]: 2025-10-07 22:04:36.760 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.032s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:04:36 compute-0 nova_compute[192716]: 2025-10-07 22:04:36.761 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5871MB free_disk=73.303466796875GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 07 22:04:36 compute-0 nova_compute[192716]: 2025-10-07 22:04:36.762 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:04:36 compute-0 nova_compute[192716]: 2025-10-07 22:04:36.762 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:04:37 compute-0 nova_compute[192716]: 2025-10-07 22:04:37.832 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 07 22:04:37 compute-0 nova_compute[192716]: 2025-10-07 22:04:37.832 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:04:36 up  1:13,  0 user,  load average: 0.27, 0.22, 0.29\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 07 22:04:37 compute-0 nova_compute[192716]: 2025-10-07 22:04:37.859 2 DEBUG nova.compute.provider_tree [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 22:04:38 compute-0 nova_compute[192716]: 2025-10-07 22:04:38.367 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 22:04:38 compute-0 nova_compute[192716]: 2025-10-07 22:04:38.506 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:04:38 compute-0 nova_compute[192716]: 2025-10-07 22:04:38.876 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 07 22:04:38 compute-0 nova_compute[192716]: 2025-10-07 22:04:38.877 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.115s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:04:39 compute-0 nova_compute[192716]: 2025-10-07 22:04:39.016 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:04:39 compute-0 nova_compute[192716]: 2025-10-07 22:04:39.860 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:04:39 compute-0 nova_compute[192716]: 2025-10-07 22:04:39.877 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:04:39 compute-0 nova_compute[192716]: 2025-10-07 22:04:39.877 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:04:39 compute-0 podman[221293]: 2025-10-07 22:04:39.912234014 +0000 UTC m=+0.143150073 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, container_name=ovn_controller)
Oct 07 22:04:39 compute-0 nova_compute[192716]: 2025-10-07 22:04:39.991 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:04:42 compute-0 podman[221321]: 2025-10-07 22:04:42.826942855 +0000 UTC m=+0.068776990 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct 07 22:04:43 compute-0 nova_compute[192716]: 2025-10-07 22:04:43.509 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:04:44 compute-0 nova_compute[192716]: 2025-10-07 22:04:44.862 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:04:45 compute-0 podman[221340]: 2025-10-07 22:04:45.841288779 +0000 UTC m=+0.086908055 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, version=9.6, release=1755695350, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, container_name=openstack_network_exporter, vcs-type=git, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct 07 22:04:48 compute-0 nova_compute[192716]: 2025-10-07 22:04:48.513 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:04:49 compute-0 nova_compute[192716]: 2025-10-07 22:04:49.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:04:52 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:04:52.232 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a7:d3:d5 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7f17307e-ac72-4a6f-8a05-ba2eca705379', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ddfd0140eaa4d5e8d43efda963767d1', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1675f3b1-9c7c-4176-8c45-0239d0b298ba, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=6865dbad-0588-4cfd-9a22-08a49ea1d5a5) old=Port_Binding(mac=['fa:16:3e:a7:d3:d5'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7f17307e-ac72-4a6f-8a05-ba2eca705379', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ddfd0140eaa4d5e8d43efda963767d1', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:04:52 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:04:52.233 103791 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 6865dbad-0588-4cfd-9a22-08a49ea1d5a5 in datapath 7f17307e-ac72-4a6f-8a05-ba2eca705379 updated
Oct 07 22:04:52 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:04:52.234 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7f17307e-ac72-4a6f-8a05-ba2eca705379, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 07 22:04:52 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:04:52.235 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[5bf570bd-baaf-4549-bb15-5b739ece7402]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:04:53 compute-0 nova_compute[192716]: 2025-10-07 22:04:53.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:04:54 compute-0 nova_compute[192716]: 2025-10-07 22:04:54.867 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:04:58 compute-0 nova_compute[192716]: 2025-10-07 22:04:58.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:04:59 compute-0 podman[203153]: time="2025-10-07T22:04:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:04:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:04:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 22:04:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:04:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3018 "" "Go-http-client/1.1"
Oct 07 22:04:59 compute-0 nova_compute[192716]: 2025-10-07 22:04:59.869 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:05:00 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:05:00.150 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7c:68:fb 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-5b645e4c-4ce5-4b02-bbf1-9694b669180e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5b645e4c-4ce5-4b02-bbf1-9694b669180e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'faa7f94deef04b67982eaf47a775c225', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7c549065-0811-4e2b-9cbe-c46f7e62229d, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=c4a0e56f-9a2c-43e8-9518-2c54d14988e9) old=Port_Binding(mac=['fa:16:3e:7c:68:fb'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-5b645e4c-4ce5-4b02-bbf1-9694b669180e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5b645e4c-4ce5-4b02-bbf1-9694b669180e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'faa7f94deef04b67982eaf47a775c225', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:05:00 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:05:00.151 103791 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port c4a0e56f-9a2c-43e8-9518-2c54d14988e9 in datapath 5b645e4c-4ce5-4b02-bbf1-9694b669180e updated
Oct 07 22:05:00 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:05:00.153 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5b645e4c-4ce5-4b02-bbf1-9694b669180e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 07 22:05:00 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:05:00.154 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[3949e238-41f5-4d3d-9dce-88f66b351a08]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:05:00 compute-0 podman[221364]: 2025-10-07 22:05:00.856627969 +0000 UTC m=+0.087648267 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 07 22:05:00 compute-0 podman[221363]: 2025-10-07 22:05:00.856628579 +0000 UTC m=+0.084277659 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, container_name=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251007, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct 07 22:05:01 compute-0 nova_compute[192716]: 2025-10-07 22:05:01.038 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:05:01 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:05:01.038 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:e1:cc', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '62:84:ba:a4:1b:31'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:05:01 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:05:01.039 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 07 22:05:01 compute-0 openstack_network_exporter[205305]: ERROR   22:05:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:05:01 compute-0 openstack_network_exporter[205305]: ERROR   22:05:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:05:01 compute-0 openstack_network_exporter[205305]: ERROR   22:05:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:05:01 compute-0 openstack_network_exporter[205305]: ERROR   22:05:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:05:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:05:01 compute-0 openstack_network_exporter[205305]: ERROR   22:05:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:05:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:05:03 compute-0 nova_compute[192716]: 2025-10-07 22:05:03.521 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:05:03 compute-0 podman[221404]: 2025-10-07 22:05:03.833944951 +0000 UTC m=+0.064419735 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 07 22:05:04 compute-0 nova_compute[192716]: 2025-10-07 22:05:04.872 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:05:08 compute-0 nova_compute[192716]: 2025-10-07 22:05:08.524 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:05:09 compute-0 nova_compute[192716]: 2025-10-07 22:05:09.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:05:10 compute-0 podman[221428]: 2025-10-07 22:05:10.898970249 +0000 UTC m=+0.133839983 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251007, tcib_build_tag=watcher_latest, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 07 22:05:11 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:05:11.041 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=dca786dc-b408-4181-8e47-0e14c60f13da, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:05:13 compute-0 nova_compute[192716]: 2025-10-07 22:05:13.526 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:05:13 compute-0 podman[221454]: 2025-10-07 22:05:13.844666568 +0000 UTC m=+0.079760459 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20251007)
Oct 07 22:05:14 compute-0 ovn_controller[94904]: 2025-10-07T22:05:14Z|00153|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Oct 07 22:05:14 compute-0 nova_compute[192716]: 2025-10-07 22:05:14.876 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:05:16 compute-0 podman[221474]: 2025-10-07 22:05:16.85411465 +0000 UTC m=+0.083143346 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, container_name=openstack_network_exporter, io.openshift.expose-services=)
Oct 07 22:05:18 compute-0 nova_compute[192716]: 2025-10-07 22:05:18.529 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:05:19 compute-0 nova_compute[192716]: 2025-10-07 22:05:19.879 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:05:23 compute-0 nova_compute[192716]: 2025-10-07 22:05:23.532 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:05:24 compute-0 nova_compute[192716]: 2025-10-07 22:05:24.880 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:05:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:05:25.632 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:05:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:05:25.632 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:05:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:05:25.632 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:05:26 compute-0 nova_compute[192716]: 2025-10-07 22:05:26.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:05:28 compute-0 nova_compute[192716]: 2025-10-07 22:05:28.535 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:05:29 compute-0 podman[203153]: time="2025-10-07T22:05:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:05:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:05:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 22:05:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:05:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3017 "" "Go-http-client/1.1"
Oct 07 22:05:29 compute-0 nova_compute[192716]: 2025-10-07 22:05:29.883 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:05:31 compute-0 openstack_network_exporter[205305]: ERROR   22:05:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:05:31 compute-0 openstack_network_exporter[205305]: ERROR   22:05:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:05:31 compute-0 openstack_network_exporter[205305]: ERROR   22:05:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:05:31 compute-0 openstack_network_exporter[205305]: ERROR   22:05:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:05:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:05:31 compute-0 openstack_network_exporter[205305]: ERROR   22:05:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:05:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:05:31 compute-0 podman[221495]: 2025-10-07 22:05:31.845368133 +0000 UTC m=+0.079184212 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251007, tcib_build_tag=watcher_latest, container_name=iscsid, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Oct 07 22:05:31 compute-0 podman[221496]: 2025-10-07 22:05:31.845646861 +0000 UTC m=+0.075148695 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Oct 07 22:05:31 compute-0 nova_compute[192716]: 2025-10-07 22:05:31.875 2 DEBUG oslo_concurrency.lockutils [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Acquiring lock "598c5a67-7d06-4cd6-a149-2137b69c64f4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:05:31 compute-0 nova_compute[192716]: 2025-10-07 22:05:31.876 2 DEBUG oslo_concurrency.lockutils [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Lock "598c5a67-7d06-4cd6-a149-2137b69c64f4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:05:31 compute-0 nova_compute[192716]: 2025-10-07 22:05:31.989 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:05:31 compute-0 nova_compute[192716]: 2025-10-07 22:05:31.990 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 07 22:05:32 compute-0 nova_compute[192716]: 2025-10-07 22:05:32.383 2 DEBUG nova.compute.manager [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Oct 07 22:05:32 compute-0 nova_compute[192716]: 2025-10-07 22:05:32.952 2 DEBUG oslo_concurrency.lockutils [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:05:32 compute-0 nova_compute[192716]: 2025-10-07 22:05:32.953 2 DEBUG oslo_concurrency.lockutils [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:05:32 compute-0 nova_compute[192716]: 2025-10-07 22:05:32.960 2 DEBUG nova.virt.hardware [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Oct 07 22:05:32 compute-0 nova_compute[192716]: 2025-10-07 22:05:32.960 2 INFO nova.compute.claims [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Claim successful on node compute-0.ctlplane.example.com
Oct 07 22:05:33 compute-0 nova_compute[192716]: 2025-10-07 22:05:33.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:05:34 compute-0 nova_compute[192716]: 2025-10-07 22:05:34.032 2 DEBUG nova.compute.provider_tree [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 22:05:34 compute-0 nova_compute[192716]: 2025-10-07 22:05:34.542 2 DEBUG nova.scheduler.client.report [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 22:05:34 compute-0 podman[221534]: 2025-10-07 22:05:34.84423973 +0000 UTC m=+0.078015738 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 07 22:05:34 compute-0 nova_compute[192716]: 2025-10-07 22:05:34.886 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:05:34 compute-0 nova_compute[192716]: 2025-10-07 22:05:34.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:05:35 compute-0 nova_compute[192716]: 2025-10-07 22:05:35.054 2 DEBUG oslo_concurrency.lockutils [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.102s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:05:35 compute-0 nova_compute[192716]: 2025-10-07 22:05:35.055 2 DEBUG nova.compute.manager [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Oct 07 22:05:35 compute-0 nova_compute[192716]: 2025-10-07 22:05:35.567 2 DEBUG nova.compute.manager [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Oct 07 22:05:35 compute-0 nova_compute[192716]: 2025-10-07 22:05:35.567 2 DEBUG nova.network.neutron [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Oct 07 22:05:35 compute-0 nova_compute[192716]: 2025-10-07 22:05:35.568 2 WARNING neutronclient.v2_0.client [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:05:35 compute-0 nova_compute[192716]: 2025-10-07 22:05:35.569 2 WARNING neutronclient.v2_0.client [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:05:35 compute-0 nova_compute[192716]: 2025-10-07 22:05:35.986 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:05:35 compute-0 nova_compute[192716]: 2025-10-07 22:05:35.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:05:36 compute-0 nova_compute[192716]: 2025-10-07 22:05:36.078 2 INFO nova.virt.libvirt.driver [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 07 22:05:36 compute-0 nova_compute[192716]: 2025-10-07 22:05:36.220 2 DEBUG nova.network.neutron [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Successfully created port: 02ca2353-7f96-4a92-ad90-875a2cf33c00 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Oct 07 22:05:36 compute-0 nova_compute[192716]: 2025-10-07 22:05:36.501 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:05:36 compute-0 nova_compute[192716]: 2025-10-07 22:05:36.502 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:05:36 compute-0 nova_compute[192716]: 2025-10-07 22:05:36.502 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:05:36 compute-0 nova_compute[192716]: 2025-10-07 22:05:36.503 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 07 22:05:36 compute-0 nova_compute[192716]: 2025-10-07 22:05:36.588 2 DEBUG nova.compute.manager [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Oct 07 22:05:36 compute-0 nova_compute[192716]: 2025-10-07 22:05:36.739 2 WARNING nova.virt.libvirt.driver [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 22:05:36 compute-0 nova_compute[192716]: 2025-10-07 22:05:36.741 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:05:36 compute-0 nova_compute[192716]: 2025-10-07 22:05:36.782 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:05:36 compute-0 nova_compute[192716]: 2025-10-07 22:05:36.783 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5863MB free_disk=73.303466796875GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 07 22:05:36 compute-0 nova_compute[192716]: 2025-10-07 22:05:36.783 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:05:36 compute-0 nova_compute[192716]: 2025-10-07 22:05:36.783 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:05:37 compute-0 nova_compute[192716]: 2025-10-07 22:05:37.349 2 DEBUG nova.network.neutron [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Successfully updated port: 02ca2353-7f96-4a92-ad90-875a2cf33c00 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Oct 07 22:05:37 compute-0 nova_compute[192716]: 2025-10-07 22:05:37.427 2 DEBUG nova.compute.manager [req-e202e4a5-6b63-4bb9-bd6a-935c2c081316 req-cbf3d529-9bc7-4375-a429-d40e785ac0a7 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Received event network-changed-02ca2353-7f96-4a92-ad90-875a2cf33c00 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:05:37 compute-0 nova_compute[192716]: 2025-10-07 22:05:37.428 2 DEBUG nova.compute.manager [req-e202e4a5-6b63-4bb9-bd6a-935c2c081316 req-cbf3d529-9bc7-4375-a429-d40e785ac0a7 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Refreshing instance network info cache due to event network-changed-02ca2353-7f96-4a92-ad90-875a2cf33c00. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Oct 07 22:05:37 compute-0 nova_compute[192716]: 2025-10-07 22:05:37.428 2 DEBUG oslo_concurrency.lockutils [req-e202e4a5-6b63-4bb9-bd6a-935c2c081316 req-cbf3d529-9bc7-4375-a429-d40e785ac0a7 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "refresh_cache-598c5a67-7d06-4cd6-a149-2137b69c64f4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 07 22:05:37 compute-0 nova_compute[192716]: 2025-10-07 22:05:37.428 2 DEBUG oslo_concurrency.lockutils [req-e202e4a5-6b63-4bb9-bd6a-935c2c081316 req-cbf3d529-9bc7-4375-a429-d40e785ac0a7 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquired lock "refresh_cache-598c5a67-7d06-4cd6-a149-2137b69c64f4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 07 22:05:37 compute-0 nova_compute[192716]: 2025-10-07 22:05:37.429 2 DEBUG nova.network.neutron [req-e202e4a5-6b63-4bb9-bd6a-935c2c081316 req-cbf3d529-9bc7-4375-a429-d40e785ac0a7 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Refreshing network info cache for port 02ca2353-7f96-4a92-ad90-875a2cf33c00 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Oct 07 22:05:37 compute-0 nova_compute[192716]: 2025-10-07 22:05:37.611 2 DEBUG nova.compute.manager [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Oct 07 22:05:37 compute-0 nova_compute[192716]: 2025-10-07 22:05:37.613 2 DEBUG nova.virt.libvirt.driver [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Oct 07 22:05:37 compute-0 nova_compute[192716]: 2025-10-07 22:05:37.614 2 INFO nova.virt.libvirt.driver [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Creating image(s)
Oct 07 22:05:37 compute-0 nova_compute[192716]: 2025-10-07 22:05:37.615 2 DEBUG oslo_concurrency.lockutils [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Acquiring lock "/var/lib/nova/instances/598c5a67-7d06-4cd6-a149-2137b69c64f4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:05:37 compute-0 nova_compute[192716]: 2025-10-07 22:05:37.615 2 DEBUG oslo_concurrency.lockutils [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Lock "/var/lib/nova/instances/598c5a67-7d06-4cd6-a149-2137b69c64f4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:05:37 compute-0 nova_compute[192716]: 2025-10-07 22:05:37.616 2 DEBUG oslo_concurrency.lockutils [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Lock "/var/lib/nova/instances/598c5a67-7d06-4cd6-a149-2137b69c64f4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:05:37 compute-0 nova_compute[192716]: 2025-10-07 22:05:37.616 2 DEBUG oslo_utils.imageutils.format_inspector [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:05:37 compute-0 nova_compute[192716]: 2025-10-07 22:05:37.621 2 DEBUG oslo_utils.imageutils.format_inspector [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:05:37 compute-0 nova_compute[192716]: 2025-10-07 22:05:37.623 2 DEBUG oslo_concurrency.processutils [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:05:37 compute-0 nova_compute[192716]: 2025-10-07 22:05:37.705 2 DEBUG oslo_concurrency.processutils [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:05:37 compute-0 nova_compute[192716]: 2025-10-07 22:05:37.706 2 DEBUG oslo_concurrency.lockutils [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Acquiring lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:05:37 compute-0 nova_compute[192716]: 2025-10-07 22:05:37.708 2 DEBUG oslo_concurrency.lockutils [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:05:37 compute-0 nova_compute[192716]: 2025-10-07 22:05:37.708 2 DEBUG oslo_utils.imageutils.format_inspector [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:05:37 compute-0 nova_compute[192716]: 2025-10-07 22:05:37.714 2 DEBUG oslo_utils.imageutils.format_inspector [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:05:37 compute-0 nova_compute[192716]: 2025-10-07 22:05:37.715 2 DEBUG oslo_concurrency.processutils [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:05:37 compute-0 nova_compute[192716]: 2025-10-07 22:05:37.809 2 DEBUG oslo_concurrency.processutils [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:05:37 compute-0 nova_compute[192716]: 2025-10-07 22:05:37.810 2 DEBUG oslo_concurrency.processutils [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71,backing_fmt=raw /var/lib/nova/instances/598c5a67-7d06-4cd6-a149-2137b69c64f4/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:05:37 compute-0 nova_compute[192716]: 2025-10-07 22:05:37.832 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Instance 598c5a67-7d06-4cd6-a149-2137b69c64f4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 07 22:05:37 compute-0 nova_compute[192716]: 2025-10-07 22:05:37.832 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 07 22:05:37 compute-0 nova_compute[192716]: 2025-10-07 22:05:37.833 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:05:36 up  1:14,  0 user,  load average: 0.10, 0.18, 0.27\n', 'num_instances': '1', 'num_vm_building': '1', 'num_task_block_device_mapping': '1', 'num_os_type_None': '1', 'num_proj_faa7f94deef04b67982eaf47a775c225': '1', 'io_workload': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 07 22:05:37 compute-0 nova_compute[192716]: 2025-10-07 22:05:37.850 2 DEBUG oslo_concurrency.processutils [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71,backing_fmt=raw /var/lib/nova/instances/598c5a67-7d06-4cd6-a149-2137b69c64f4/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:05:37 compute-0 nova_compute[192716]: 2025-10-07 22:05:37.851 2 DEBUG oslo_concurrency.lockutils [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.143s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:05:37 compute-0 nova_compute[192716]: 2025-10-07 22:05:37.851 2 DEBUG oslo_concurrency.processutils [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:05:37 compute-0 nova_compute[192716]: 2025-10-07 22:05:37.864 2 DEBUG oslo_concurrency.lockutils [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Acquiring lock "refresh_cache-598c5a67-7d06-4cd6-a149-2137b69c64f4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 07 22:05:37 compute-0 nova_compute[192716]: 2025-10-07 22:05:37.887 2 DEBUG nova.compute.provider_tree [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 22:05:37 compute-0 nova_compute[192716]: 2025-10-07 22:05:37.909 2 DEBUG oslo_concurrency.processutils [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:05:37 compute-0 nova_compute[192716]: 2025-10-07 22:05:37.909 2 DEBUG nova.virt.disk.api [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Checking if we can resize image /var/lib/nova/instances/598c5a67-7d06-4cd6-a149-2137b69c64f4/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 07 22:05:37 compute-0 nova_compute[192716]: 2025-10-07 22:05:37.910 2 DEBUG oslo_concurrency.processutils [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/598c5a67-7d06-4cd6-a149-2137b69c64f4/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:05:37 compute-0 nova_compute[192716]: 2025-10-07 22:05:37.936 2 WARNING neutronclient.v2_0.client [req-e202e4a5-6b63-4bb9-bd6a-935c2c081316 req-cbf3d529-9bc7-4375-a429-d40e785ac0a7 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:05:37 compute-0 nova_compute[192716]: 2025-10-07 22:05:37.966 2 DEBUG oslo_concurrency.processutils [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/598c5a67-7d06-4cd6-a149-2137b69c64f4/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:05:37 compute-0 nova_compute[192716]: 2025-10-07 22:05:37.967 2 DEBUG nova.virt.disk.api [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Cannot resize image /var/lib/nova/instances/598c5a67-7d06-4cd6-a149-2137b69c64f4/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 07 22:05:37 compute-0 nova_compute[192716]: 2025-10-07 22:05:37.967 2 DEBUG nova.virt.libvirt.driver [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Oct 07 22:05:37 compute-0 nova_compute[192716]: 2025-10-07 22:05:37.968 2 DEBUG nova.virt.libvirt.driver [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Ensure instance console log exists: /var/lib/nova/instances/598c5a67-7d06-4cd6-a149-2137b69c64f4/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Oct 07 22:05:37 compute-0 nova_compute[192716]: 2025-10-07 22:05:37.969 2 DEBUG oslo_concurrency.lockutils [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:05:37 compute-0 nova_compute[192716]: 2025-10-07 22:05:37.969 2 DEBUG oslo_concurrency.lockutils [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:05:37 compute-0 nova_compute[192716]: 2025-10-07 22:05:37.970 2 DEBUG oslo_concurrency.lockutils [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:05:38 compute-0 nova_compute[192716]: 2025-10-07 22:05:38.397 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 22:05:38 compute-0 nova_compute[192716]: 2025-10-07 22:05:38.539 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:05:38 compute-0 nova_compute[192716]: 2025-10-07 22:05:38.745 2 DEBUG nova.network.neutron [req-e202e4a5-6b63-4bb9-bd6a-935c2c081316 req-cbf3d529-9bc7-4375-a429-d40e785ac0a7 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 07 22:05:38 compute-0 nova_compute[192716]: 2025-10-07 22:05:38.911 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 07 22:05:38 compute-0 nova_compute[192716]: 2025-10-07 22:05:38.912 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.129s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:05:38 compute-0 nova_compute[192716]: 2025-10-07 22:05:38.952 2 DEBUG nova.network.neutron [req-e202e4a5-6b63-4bb9-bd6a-935c2c081316 req-cbf3d529-9bc7-4375-a429-d40e785ac0a7 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:05:39 compute-0 nova_compute[192716]: 2025-10-07 22:05:39.459 2 DEBUG oslo_concurrency.lockutils [req-e202e4a5-6b63-4bb9-bd6a-935c2c081316 req-cbf3d529-9bc7-4375-a429-d40e785ac0a7 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Releasing lock "refresh_cache-598c5a67-7d06-4cd6-a149-2137b69c64f4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 07 22:05:39 compute-0 nova_compute[192716]: 2025-10-07 22:05:39.461 2 DEBUG oslo_concurrency.lockutils [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Acquired lock "refresh_cache-598c5a67-7d06-4cd6-a149-2137b69c64f4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 07 22:05:39 compute-0 nova_compute[192716]: 2025-10-07 22:05:39.462 2 DEBUG nova.network.neutron [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 07 22:05:39 compute-0 nova_compute[192716]: 2025-10-07 22:05:39.887 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:05:41 compute-0 nova_compute[192716]: 2025-10-07 22:05:41.042 2 DEBUG nova.network.neutron [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 07 22:05:41 compute-0 podman[221574]: 2025-10-07 22:05:41.895485029 +0000 UTC m=+0.122739361 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251007)
Oct 07 22:05:41 compute-0 nova_compute[192716]: 2025-10-07 22:05:41.913 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:05:42 compute-0 nova_compute[192716]: 2025-10-07 22:05:42.423 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:05:42 compute-0 nova_compute[192716]: 2025-10-07 22:05:42.424 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:05:42 compute-0 nova_compute[192716]: 2025-10-07 22:05:42.424 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:05:43 compute-0 nova_compute[192716]: 2025-10-07 22:05:43.089 2 WARNING neutronclient.v2_0.client [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:05:43 compute-0 nova_compute[192716]: 2025-10-07 22:05:43.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:05:44 compute-0 nova_compute[192716]: 2025-10-07 22:05:44.142 2 DEBUG nova.network.neutron [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Updating instance_info_cache with network_info: [{"id": "02ca2353-7f96-4a92-ad90-875a2cf33c00", "address": "fa:16:3e:63:ff:62", "network": {"id": "7f17307e-ac72-4a6f-8a05-ba2eca705379", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-301862773-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ddfd0140eaa4d5e8d43efda963767d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02ca2353-7f", "ovs_interfaceid": "02ca2353-7f96-4a92-ad90-875a2cf33c00", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:05:44 compute-0 nova_compute[192716]: 2025-10-07 22:05:44.650 2 DEBUG oslo_concurrency.lockutils [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Releasing lock "refresh_cache-598c5a67-7d06-4cd6-a149-2137b69c64f4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 07 22:05:44 compute-0 nova_compute[192716]: 2025-10-07 22:05:44.651 2 DEBUG nova.compute.manager [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Instance network_info: |[{"id": "02ca2353-7f96-4a92-ad90-875a2cf33c00", "address": "fa:16:3e:63:ff:62", "network": {"id": "7f17307e-ac72-4a6f-8a05-ba2eca705379", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-301862773-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ddfd0140eaa4d5e8d43efda963767d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02ca2353-7f", "ovs_interfaceid": "02ca2353-7f96-4a92-ad90-875a2cf33c00", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Oct 07 22:05:44 compute-0 nova_compute[192716]: 2025-10-07 22:05:44.655 2 DEBUG nova.virt.libvirt.driver [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Start _get_guest_xml network_info=[{"id": "02ca2353-7f96-4a92-ad90-875a2cf33c00", "address": "fa:16:3e:63:ff:62", "network": {"id": "7f17307e-ac72-4a6f-8a05-ba2eca705379", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-301862773-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ddfd0140eaa4d5e8d43efda963767d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02ca2353-7f", "ovs_interfaceid": "02ca2353-7f96-4a92-ad90-875a2cf33c00", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T21:45:40Z,direct_url=<?>,disk_format='qcow2',id=c40cab67-7e52-4762-b275-de0efa24bdf4,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='293ff4341f3d48a4ae100bf4fc7b99bd',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T21:45:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'boot_index': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'image_id': 'c40cab67-7e52-4762-b275-de0efa24bdf4'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Oct 07 22:05:44 compute-0 nova_compute[192716]: 2025-10-07 22:05:44.660 2 WARNING nova.virt.libvirt.driver [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 22:05:44 compute-0 nova_compute[192716]: 2025-10-07 22:05:44.662 2 DEBUG nova.virt.driver [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='c40cab67-7e52-4762-b275-de0efa24bdf4', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-81417462', uuid='598c5a67-7d06-4cd6-a149-2137b69c64f4'), owner=OwnerMeta(userid='641fbca23ed24b428028d3bc567991bf', username='tempest-TestExecuteNodeResourceConsolidationStrategy-2065444639-project-admin', projectid='faa7f94deef04b67982eaf47a775c225', projectname='tempest-TestExecuteNodeResourceConsolidationStrategy-2065444639'), image=ImageMeta(id='c40cab67-7e52-4762-b275-de0efa24bdf4', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='e6ccb6f6-2ec4-4305-9fdd-4d931e83ed21', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "02ca2353-7f96-4a92-ad90-875a2cf33c00", "address": "fa:16:3e:63:ff:62", "network": {"id": "7f17307e-ac72-4a6f-8a05-ba2eca705379", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-301862773-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ddfd0140eaa4d5e8d43efda963767d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02ca2353-7f", "ovs_interfaceid": "02ca2353-7f96-4a92-ad90-875a2cf33c00", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251007122402.7278e66.el10', creation_time=1759874744.6625664) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Oct 07 22:05:44 compute-0 nova_compute[192716]: 2025-10-07 22:05:44.668 2 DEBUG nova.virt.libvirt.host [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Oct 07 22:05:44 compute-0 nova_compute[192716]: 2025-10-07 22:05:44.669 2 DEBUG nova.virt.libvirt.host [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Oct 07 22:05:44 compute-0 nova_compute[192716]: 2025-10-07 22:05:44.674 2 DEBUG nova.virt.libvirt.host [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Oct 07 22:05:44 compute-0 nova_compute[192716]: 2025-10-07 22:05:44.675 2 DEBUG nova.virt.libvirt.host [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Oct 07 22:05:44 compute-0 nova_compute[192716]: 2025-10-07 22:05:44.675 2 DEBUG nova.virt.libvirt.driver [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Oct 07 22:05:44 compute-0 nova_compute[192716]: 2025-10-07 22:05:44.676 2 DEBUG nova.virt.hardware [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-07T21:45:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='e6ccb6f6-2ec4-4305-9fdd-4d931e83ed21',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T21:45:40Z,direct_url=<?>,disk_format='qcow2',id=c40cab67-7e52-4762-b275-de0efa24bdf4,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='293ff4341f3d48a4ae100bf4fc7b99bd',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T21:45:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Oct 07 22:05:44 compute-0 nova_compute[192716]: 2025-10-07 22:05:44.677 2 DEBUG nova.virt.hardware [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Oct 07 22:05:44 compute-0 nova_compute[192716]: 2025-10-07 22:05:44.677 2 DEBUG nova.virt.hardware [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Oct 07 22:05:44 compute-0 nova_compute[192716]: 2025-10-07 22:05:44.677 2 DEBUG nova.virt.hardware [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Oct 07 22:05:44 compute-0 nova_compute[192716]: 2025-10-07 22:05:44.678 2 DEBUG nova.virt.hardware [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Oct 07 22:05:44 compute-0 nova_compute[192716]: 2025-10-07 22:05:44.678 2 DEBUG nova.virt.hardware [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Oct 07 22:05:44 compute-0 nova_compute[192716]: 2025-10-07 22:05:44.679 2 DEBUG nova.virt.hardware [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Oct 07 22:05:44 compute-0 nova_compute[192716]: 2025-10-07 22:05:44.679 2 DEBUG nova.virt.hardware [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Oct 07 22:05:44 compute-0 nova_compute[192716]: 2025-10-07 22:05:44.680 2 DEBUG nova.virt.hardware [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Oct 07 22:05:44 compute-0 nova_compute[192716]: 2025-10-07 22:05:44.680 2 DEBUG nova.virt.hardware [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Oct 07 22:05:44 compute-0 nova_compute[192716]: 2025-10-07 22:05:44.681 2 DEBUG nova.virt.hardware [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Oct 07 22:05:44 compute-0 nova_compute[192716]: 2025-10-07 22:05:44.689 2 DEBUG nova.virt.libvirt.vif [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-07T22:05:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-81417462',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-814',id=17,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='faa7f94deef04b67982eaf47a775c225',ramdisk_id='',reservation_id='r-02484yh9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-2065444639',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-2065444639-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T22:05:36Z,user_data=None,user_id='641fbca23ed24b428028d3bc567991bf',uuid=598c5a67-7d06-4cd6-a149-2137b69c64f4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "02ca2353-7f96-4a92-ad90-875a2cf33c00", "address": "fa:16:3e:63:ff:62", "network": {"id": "7f17307e-ac72-4a6f-8a05-ba2eca705379", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-301862773-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ddfd0140eaa4d5e8d43efda963767d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02ca2353-7f", "ovs_interfaceid": "02ca2353-7f96-4a92-ad90-875a2cf33c00", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Oct 07 22:05:44 compute-0 nova_compute[192716]: 2025-10-07 22:05:44.690 2 DEBUG nova.network.os_vif_util [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Converting VIF {"id": "02ca2353-7f96-4a92-ad90-875a2cf33c00", "address": "fa:16:3e:63:ff:62", "network": {"id": "7f17307e-ac72-4a6f-8a05-ba2eca705379", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-301862773-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ddfd0140eaa4d5e8d43efda963767d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02ca2353-7f", "ovs_interfaceid": "02ca2353-7f96-4a92-ad90-875a2cf33c00", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 22:05:44 compute-0 nova_compute[192716]: 2025-10-07 22:05:44.691 2 DEBUG nova.network.os_vif_util [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:63:ff:62,bridge_name='br-int',has_traffic_filtering=True,id=02ca2353-7f96-4a92-ad90-875a2cf33c00,network=Network(7f17307e-ac72-4a6f-8a05-ba2eca705379),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02ca2353-7f') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 22:05:44 compute-0 nova_compute[192716]: 2025-10-07 22:05:44.693 2 DEBUG nova.objects.instance [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Lazy-loading 'pci_devices' on Instance uuid 598c5a67-7d06-4cd6-a149-2137b69c64f4 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 22:05:44 compute-0 podman[221600]: 2025-10-07 22:05:44.841129956 +0000 UTC m=+0.077371950 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Oct 07 22:05:44 compute-0 nova_compute[192716]: 2025-10-07 22:05:44.890 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:05:45 compute-0 nova_compute[192716]: 2025-10-07 22:05:45.201 2 DEBUG nova.virt.libvirt.driver [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] End _get_guest_xml xml=<domain type="kvm">
Oct 07 22:05:45 compute-0 nova_compute[192716]:   <uuid>598c5a67-7d06-4cd6-a149-2137b69c64f4</uuid>
Oct 07 22:05:45 compute-0 nova_compute[192716]:   <name>instance-00000011</name>
Oct 07 22:05:45 compute-0 nova_compute[192716]:   <memory>131072</memory>
Oct 07 22:05:45 compute-0 nova_compute[192716]:   <vcpu>1</vcpu>
Oct 07 22:05:45 compute-0 nova_compute[192716]:   <metadata>
Oct 07 22:05:45 compute-0 nova_compute[192716]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 07 22:05:45 compute-0 nova_compute[192716]:       <nova:package version="32.1.0-0.20251007122402.7278e66.el10"/>
Oct 07 22:05:45 compute-0 nova_compute[192716]:       <nova:name>tempest-TestExecuteNodeResourceConsolidationStrategy-server-81417462</nova:name>
Oct 07 22:05:45 compute-0 nova_compute[192716]:       <nova:creationTime>2025-10-07 22:05:44</nova:creationTime>
Oct 07 22:05:45 compute-0 nova_compute[192716]:       <nova:flavor name="m1.nano" id="e6ccb6f6-2ec4-4305-9fdd-4d931e83ed21">
Oct 07 22:05:45 compute-0 nova_compute[192716]:         <nova:memory>128</nova:memory>
Oct 07 22:05:45 compute-0 nova_compute[192716]:         <nova:disk>1</nova:disk>
Oct 07 22:05:45 compute-0 nova_compute[192716]:         <nova:swap>0</nova:swap>
Oct 07 22:05:45 compute-0 nova_compute[192716]:         <nova:ephemeral>0</nova:ephemeral>
Oct 07 22:05:45 compute-0 nova_compute[192716]:         <nova:vcpus>1</nova:vcpus>
Oct 07 22:05:45 compute-0 nova_compute[192716]:         <nova:extraSpecs>
Oct 07 22:05:45 compute-0 nova_compute[192716]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 07 22:05:45 compute-0 nova_compute[192716]:         </nova:extraSpecs>
Oct 07 22:05:45 compute-0 nova_compute[192716]:       </nova:flavor>
Oct 07 22:05:45 compute-0 nova_compute[192716]:       <nova:image uuid="c40cab67-7e52-4762-b275-de0efa24bdf4">
Oct 07 22:05:45 compute-0 nova_compute[192716]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 07 22:05:45 compute-0 nova_compute[192716]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 07 22:05:45 compute-0 nova_compute[192716]:         <nova:minDisk>1</nova:minDisk>
Oct 07 22:05:45 compute-0 nova_compute[192716]:         <nova:minRam>0</nova:minRam>
Oct 07 22:05:45 compute-0 nova_compute[192716]:         <nova:properties>
Oct 07 22:05:45 compute-0 nova_compute[192716]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 07 22:05:45 compute-0 nova_compute[192716]:         </nova:properties>
Oct 07 22:05:45 compute-0 nova_compute[192716]:       </nova:image>
Oct 07 22:05:45 compute-0 nova_compute[192716]:       <nova:owner>
Oct 07 22:05:45 compute-0 nova_compute[192716]:         <nova:user uuid="641fbca23ed24b428028d3bc567991bf">tempest-TestExecuteNodeResourceConsolidationStrategy-2065444639-project-admin</nova:user>
Oct 07 22:05:45 compute-0 nova_compute[192716]:         <nova:project uuid="faa7f94deef04b67982eaf47a775c225">tempest-TestExecuteNodeResourceConsolidationStrategy-2065444639</nova:project>
Oct 07 22:05:45 compute-0 nova_compute[192716]:       </nova:owner>
Oct 07 22:05:45 compute-0 nova_compute[192716]:       <nova:root type="image" uuid="c40cab67-7e52-4762-b275-de0efa24bdf4"/>
Oct 07 22:05:45 compute-0 nova_compute[192716]:       <nova:ports>
Oct 07 22:05:45 compute-0 nova_compute[192716]:         <nova:port uuid="02ca2353-7f96-4a92-ad90-875a2cf33c00">
Oct 07 22:05:45 compute-0 nova_compute[192716]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Oct 07 22:05:45 compute-0 nova_compute[192716]:         </nova:port>
Oct 07 22:05:45 compute-0 nova_compute[192716]:       </nova:ports>
Oct 07 22:05:45 compute-0 nova_compute[192716]:     </nova:instance>
Oct 07 22:05:45 compute-0 nova_compute[192716]:   </metadata>
Oct 07 22:05:45 compute-0 nova_compute[192716]:   <sysinfo type="smbios">
Oct 07 22:05:45 compute-0 nova_compute[192716]:     <system>
Oct 07 22:05:45 compute-0 nova_compute[192716]:       <entry name="manufacturer">RDO</entry>
Oct 07 22:05:45 compute-0 nova_compute[192716]:       <entry name="product">OpenStack Compute</entry>
Oct 07 22:05:45 compute-0 nova_compute[192716]:       <entry name="version">32.1.0-0.20251007122402.7278e66.el10</entry>
Oct 07 22:05:45 compute-0 nova_compute[192716]:       <entry name="serial">598c5a67-7d06-4cd6-a149-2137b69c64f4</entry>
Oct 07 22:05:45 compute-0 nova_compute[192716]:       <entry name="uuid">598c5a67-7d06-4cd6-a149-2137b69c64f4</entry>
Oct 07 22:05:45 compute-0 nova_compute[192716]:       <entry name="family">Virtual Machine</entry>
Oct 07 22:05:45 compute-0 nova_compute[192716]:     </system>
Oct 07 22:05:45 compute-0 nova_compute[192716]:   </sysinfo>
Oct 07 22:05:45 compute-0 nova_compute[192716]:   <os>
Oct 07 22:05:45 compute-0 nova_compute[192716]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 07 22:05:45 compute-0 nova_compute[192716]:     <boot dev="hd"/>
Oct 07 22:05:45 compute-0 nova_compute[192716]:     <smbios mode="sysinfo"/>
Oct 07 22:05:45 compute-0 nova_compute[192716]:   </os>
Oct 07 22:05:45 compute-0 nova_compute[192716]:   <features>
Oct 07 22:05:45 compute-0 nova_compute[192716]:     <acpi/>
Oct 07 22:05:45 compute-0 nova_compute[192716]:     <apic/>
Oct 07 22:05:45 compute-0 nova_compute[192716]:     <vmcoreinfo/>
Oct 07 22:05:45 compute-0 nova_compute[192716]:   </features>
Oct 07 22:05:45 compute-0 nova_compute[192716]:   <clock offset="utc">
Oct 07 22:05:45 compute-0 nova_compute[192716]:     <timer name="pit" tickpolicy="delay"/>
Oct 07 22:05:45 compute-0 nova_compute[192716]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 07 22:05:45 compute-0 nova_compute[192716]:     <timer name="hpet" present="no"/>
Oct 07 22:05:45 compute-0 nova_compute[192716]:   </clock>
Oct 07 22:05:45 compute-0 nova_compute[192716]:   <cpu mode="host-model" match="exact">
Oct 07 22:05:45 compute-0 nova_compute[192716]:     <topology sockets="1" cores="1" threads="1"/>
Oct 07 22:05:45 compute-0 nova_compute[192716]:   </cpu>
Oct 07 22:05:45 compute-0 nova_compute[192716]:   <devices>
Oct 07 22:05:45 compute-0 nova_compute[192716]:     <disk type="file" device="disk">
Oct 07 22:05:45 compute-0 nova_compute[192716]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 07 22:05:45 compute-0 nova_compute[192716]:       <source file="/var/lib/nova/instances/598c5a67-7d06-4cd6-a149-2137b69c64f4/disk"/>
Oct 07 22:05:45 compute-0 nova_compute[192716]:       <target dev="vda" bus="virtio"/>
Oct 07 22:05:45 compute-0 nova_compute[192716]:     </disk>
Oct 07 22:05:45 compute-0 nova_compute[192716]:     <disk type="file" device="cdrom">
Oct 07 22:05:45 compute-0 nova_compute[192716]:       <driver name="qemu" type="raw" cache="none"/>
Oct 07 22:05:45 compute-0 nova_compute[192716]:       <source file="/var/lib/nova/instances/598c5a67-7d06-4cd6-a149-2137b69c64f4/disk.config"/>
Oct 07 22:05:45 compute-0 nova_compute[192716]:       <target dev="sda" bus="sata"/>
Oct 07 22:05:45 compute-0 nova_compute[192716]:     </disk>
Oct 07 22:05:45 compute-0 nova_compute[192716]:     <interface type="ethernet">
Oct 07 22:05:45 compute-0 nova_compute[192716]:       <mac address="fa:16:3e:63:ff:62"/>
Oct 07 22:05:45 compute-0 nova_compute[192716]:       <model type="virtio"/>
Oct 07 22:05:45 compute-0 nova_compute[192716]:       <driver name="vhost" rx_queue_size="512"/>
Oct 07 22:05:45 compute-0 nova_compute[192716]:       <mtu size="1442"/>
Oct 07 22:05:45 compute-0 nova_compute[192716]:       <target dev="tap02ca2353-7f"/>
Oct 07 22:05:45 compute-0 nova_compute[192716]:     </interface>
Oct 07 22:05:45 compute-0 nova_compute[192716]:     <serial type="pty">
Oct 07 22:05:45 compute-0 nova_compute[192716]:       <log file="/var/lib/nova/instances/598c5a67-7d06-4cd6-a149-2137b69c64f4/console.log" append="off"/>
Oct 07 22:05:45 compute-0 nova_compute[192716]:     </serial>
Oct 07 22:05:45 compute-0 nova_compute[192716]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 07 22:05:45 compute-0 nova_compute[192716]:     <video>
Oct 07 22:05:45 compute-0 nova_compute[192716]:       <model type="virtio"/>
Oct 07 22:05:45 compute-0 nova_compute[192716]:     </video>
Oct 07 22:05:45 compute-0 nova_compute[192716]:     <input type="tablet" bus="usb"/>
Oct 07 22:05:45 compute-0 nova_compute[192716]:     <rng model="virtio">
Oct 07 22:05:45 compute-0 nova_compute[192716]:       <backend model="random">/dev/urandom</backend>
Oct 07 22:05:45 compute-0 nova_compute[192716]:     </rng>
Oct 07 22:05:45 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root"/>
Oct 07 22:05:45 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:05:45 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:05:45 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:05:45 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:05:45 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:05:45 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:05:45 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:05:45 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:05:45 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:05:45 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:05:45 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:05:45 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:05:45 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:05:45 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:05:45 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:05:45 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:05:45 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:05:45 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:05:45 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:05:45 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:05:45 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:05:45 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:05:45 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:05:45 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:05:45 compute-0 nova_compute[192716]:     <controller type="usb" index="0"/>
Oct 07 22:05:45 compute-0 nova_compute[192716]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 07 22:05:45 compute-0 nova_compute[192716]:       <stats period="10"/>
Oct 07 22:05:45 compute-0 nova_compute[192716]:     </memballoon>
Oct 07 22:05:45 compute-0 nova_compute[192716]:   </devices>
Oct 07 22:05:45 compute-0 nova_compute[192716]: </domain>
Oct 07 22:05:45 compute-0 nova_compute[192716]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Oct 07 22:05:45 compute-0 nova_compute[192716]: 2025-10-07 22:05:45.202 2 DEBUG nova.compute.manager [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Preparing to wait for external event network-vif-plugged-02ca2353-7f96-4a92-ad90-875a2cf33c00 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Oct 07 22:05:45 compute-0 nova_compute[192716]: 2025-10-07 22:05:45.202 2 DEBUG oslo_concurrency.lockutils [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Acquiring lock "598c5a67-7d06-4cd6-a149-2137b69c64f4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:05:45 compute-0 nova_compute[192716]: 2025-10-07 22:05:45.203 2 DEBUG oslo_concurrency.lockutils [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Lock "598c5a67-7d06-4cd6-a149-2137b69c64f4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:05:45 compute-0 nova_compute[192716]: 2025-10-07 22:05:45.203 2 DEBUG oslo_concurrency.lockutils [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Lock "598c5a67-7d06-4cd6-a149-2137b69c64f4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:05:45 compute-0 nova_compute[192716]: 2025-10-07 22:05:45.204 2 DEBUG nova.virt.libvirt.vif [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-07T22:05:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-81417462',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-814',id=17,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='faa7f94deef04b67982eaf47a775c225',ramdisk_id='',reservation_id='r-02484yh9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-2065444639',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-2065444639-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T22:05:36Z,user_data=None,user_id='641fbca23ed24b428028d3bc567991bf',uuid=598c5a67-7d06-4cd6-a149-2137b69c64f4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "02ca2353-7f96-4a92-ad90-875a2cf33c00", "address": "fa:16:3e:63:ff:62", "network": {"id": "7f17307e-ac72-4a6f-8a05-ba2eca705379", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-301862773-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ddfd0140eaa4d5e8d43efda963767d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02ca2353-7f", "ovs_interfaceid": "02ca2353-7f96-4a92-ad90-875a2cf33c00", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 07 22:05:45 compute-0 nova_compute[192716]: 2025-10-07 22:05:45.204 2 DEBUG nova.network.os_vif_util [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Converting VIF {"id": "02ca2353-7f96-4a92-ad90-875a2cf33c00", "address": "fa:16:3e:63:ff:62", "network": {"id": "7f17307e-ac72-4a6f-8a05-ba2eca705379", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-301862773-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ddfd0140eaa4d5e8d43efda963767d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02ca2353-7f", "ovs_interfaceid": "02ca2353-7f96-4a92-ad90-875a2cf33c00", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 22:05:45 compute-0 nova_compute[192716]: 2025-10-07 22:05:45.205 2 DEBUG nova.network.os_vif_util [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:63:ff:62,bridge_name='br-int',has_traffic_filtering=True,id=02ca2353-7f96-4a92-ad90-875a2cf33c00,network=Network(7f17307e-ac72-4a6f-8a05-ba2eca705379),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02ca2353-7f') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 22:05:45 compute-0 nova_compute[192716]: 2025-10-07 22:05:45.205 2 DEBUG os_vif [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:63:ff:62,bridge_name='br-int',has_traffic_filtering=True,id=02ca2353-7f96-4a92-ad90-875a2cf33c00,network=Network(7f17307e-ac72-4a6f-8a05-ba2eca705379),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02ca2353-7f') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 07 22:05:45 compute-0 nova_compute[192716]: 2025-10-07 22:05:45.206 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:05:45 compute-0 nova_compute[192716]: 2025-10-07 22:05:45.206 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:05:45 compute-0 nova_compute[192716]: 2025-10-07 22:05:45.206 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:05:45 compute-0 nova_compute[192716]: 2025-10-07 22:05:45.207 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:05:45 compute-0 nova_compute[192716]: 2025-10-07 22:05:45.208 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '48c71b93-ca56-5d30-847d-a431dfa2cda7', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:05:45 compute-0 nova_compute[192716]: 2025-10-07 22:05:45.209 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:05:45 compute-0 nova_compute[192716]: 2025-10-07 22:05:45.210 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:05:45 compute-0 nova_compute[192716]: 2025-10-07 22:05:45.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:05:45 compute-0 nova_compute[192716]: 2025-10-07 22:05:45.214 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap02ca2353-7f, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:05:45 compute-0 nova_compute[192716]: 2025-10-07 22:05:45.214 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap02ca2353-7f, col_values=(('qos', UUID('8a6730bb-023e-4230-b1f8-eed9261cf634')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:05:45 compute-0 nova_compute[192716]: 2025-10-07 22:05:45.214 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap02ca2353-7f, col_values=(('external_ids', {'iface-id': '02ca2353-7f96-4a92-ad90-875a2cf33c00', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:63:ff:62', 'vm-uuid': '598c5a67-7d06-4cd6-a149-2137b69c64f4'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:05:45 compute-0 nova_compute[192716]: 2025-10-07 22:05:45.215 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:05:45 compute-0 NetworkManager[51722]: <info>  [1759874745.2172] manager: (tap02ca2353-7f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/56)
Oct 07 22:05:45 compute-0 nova_compute[192716]: 2025-10-07 22:05:45.218 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 07 22:05:45 compute-0 nova_compute[192716]: 2025-10-07 22:05:45.221 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:05:45 compute-0 nova_compute[192716]: 2025-10-07 22:05:45.222 2 INFO os_vif [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:63:ff:62,bridge_name='br-int',has_traffic_filtering=True,id=02ca2353-7f96-4a92-ad90-875a2cf33c00,network=Network(7f17307e-ac72-4a6f-8a05-ba2eca705379),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02ca2353-7f')
Oct 07 22:05:46 compute-0 nova_compute[192716]: 2025-10-07 22:05:46.767 2 DEBUG nova.virt.libvirt.driver [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 07 22:05:46 compute-0 nova_compute[192716]: 2025-10-07 22:05:46.767 2 DEBUG nova.virt.libvirt.driver [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 07 22:05:46 compute-0 nova_compute[192716]: 2025-10-07 22:05:46.768 2 DEBUG nova.virt.libvirt.driver [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] No VIF found with MAC fa:16:3e:63:ff:62, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Oct 07 22:05:46 compute-0 nova_compute[192716]: 2025-10-07 22:05:46.768 2 INFO nova.virt.libvirt.driver [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Using config drive
Oct 07 22:05:47 compute-0 nova_compute[192716]: 2025-10-07 22:05:47.278 2 WARNING neutronclient.v2_0.client [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:05:47 compute-0 podman[221623]: 2025-10-07 22:05:47.826260844 +0000 UTC m=+0.059522553 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.openshift.expose-services=, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, com.redhat.component=ubi9-minimal-container, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., name=ubi9-minimal, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41)
Oct 07 22:05:48 compute-0 nova_compute[192716]: 2025-10-07 22:05:48.067 2 INFO nova.virt.libvirt.driver [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Creating config drive at /var/lib/nova/instances/598c5a67-7d06-4cd6-a149-2137b69c64f4/disk.config
Oct 07 22:05:48 compute-0 nova_compute[192716]: 2025-10-07 22:05:48.072 2 DEBUG oslo_concurrency.processutils [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/598c5a67-7d06-4cd6-a149-2137b69c64f4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251007122402.7278e66.el10 -quiet -J -r -V config-2 /tmp/tmpo28vzk8d execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:05:48 compute-0 nova_compute[192716]: 2025-10-07 22:05:48.219 2 DEBUG oslo_concurrency.processutils [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/598c5a67-7d06-4cd6-a149-2137b69c64f4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251007122402.7278e66.el10 -quiet -J -r -V config-2 /tmp/tmpo28vzk8d" returned: 0 in 0.146s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:05:48 compute-0 kernel: tap02ca2353-7f: entered promiscuous mode
Oct 07 22:05:48 compute-0 NetworkManager[51722]: <info>  [1759874748.3155] manager: (tap02ca2353-7f): new Tun device (/org/freedesktop/NetworkManager/Devices/57)
Oct 07 22:05:48 compute-0 ovn_controller[94904]: 2025-10-07T22:05:48Z|00154|binding|INFO|Claiming lport 02ca2353-7f96-4a92-ad90-875a2cf33c00 for this chassis.
Oct 07 22:05:48 compute-0 ovn_controller[94904]: 2025-10-07T22:05:48Z|00155|binding|INFO|02ca2353-7f96-4a92-ad90-875a2cf33c00: Claiming fa:16:3e:63:ff:62 10.100.0.7
Oct 07 22:05:48 compute-0 nova_compute[192716]: 2025-10-07 22:05:48.317 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:05:48 compute-0 nova_compute[192716]: 2025-10-07 22:05:48.320 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:05:48 compute-0 nova_compute[192716]: 2025-10-07 22:05:48.333 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:05:48.342 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:63:ff:62 10.100.0.7'], port_security=['fa:16:3e:63:ff:62 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '598c5a67-7d06-4cd6-a149-2137b69c64f4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7f17307e-ac72-4a6f-8a05-ba2eca705379', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'faa7f94deef04b67982eaf47a775c225', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6ea0c626-bce8-4d7e-8c0d-f51033bcdaff', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1675f3b1-9c7c-4176-8c45-0239d0b298ba, chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], logical_port=02ca2353-7f96-4a92-ad90-875a2cf33c00) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:05:48.344 103791 INFO neutron.agent.ovn.metadata.agent [-] Port 02ca2353-7f96-4a92-ad90-875a2cf33c00 in datapath 7f17307e-ac72-4a6f-8a05-ba2eca705379 bound to our chassis
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:05:48.345 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7f17307e-ac72-4a6f-8a05-ba2eca705379
Oct 07 22:05:48 compute-0 systemd-udevd[221661]: Network interface NamePolicy= disabled on kernel command line.
Oct 07 22:05:48 compute-0 NetworkManager[51722]: <info>  [1759874748.3672] device (tap02ca2353-7f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:05:48.367 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[ecc222eb-5faa-4b7e-9b61-d507cdc1ebc4]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:05:48 compute-0 NetworkManager[51722]: <info>  [1759874748.3685] device (tap02ca2353-7f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:05:48.368 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7f17307e-a1 in ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Oct 07 22:05:48 compute-0 systemd-machined[152719]: New machine qemu-12-instance-00000011.
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:05:48.371 214116 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7f17307e-a0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:05:48.372 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[20025c8f-3e24-4f96-9211-7ba1c3ec6220]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:05:48.373 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[bf5d2f0d-b6c8-4850-a635-f1e5691f5fdc]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:05:48 compute-0 systemd[1]: Started Virtual Machine qemu-12-instance-00000011.
Oct 07 22:05:48 compute-0 nova_compute[192716]: 2025-10-07 22:05:48.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:05:48 compute-0 ovn_controller[94904]: 2025-10-07T22:05:48Z|00156|binding|INFO|Setting lport 02ca2353-7f96-4a92-ad90-875a2cf33c00 ovn-installed in OVS
Oct 07 22:05:48 compute-0 ovn_controller[94904]: 2025-10-07T22:05:48Z|00157|binding|INFO|Setting lport 02ca2353-7f96-4a92-ad90-875a2cf33c00 up in Southbound
Oct 07 22:05:48 compute-0 nova_compute[192716]: 2025-10-07 22:05:48.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:05:48.402 103905 DEBUG oslo.privsep.daemon [-] privsep: reply[08d52cad-7a75-436e-8336-df74dc326c9c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:05:48.409 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[89f0c435-8311-49ca-8f1e-a4151886787d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:05:48.447 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[23909bd6-800d-4911-9209-bd52829278e1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:05:48.453 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[980a129e-c57b-4d20-8a7d-7be6416d54b6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:05:48 compute-0 NetworkManager[51722]: <info>  [1759874748.4546] manager: (tap7f17307e-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/58)
Oct 07 22:05:48 compute-0 systemd-udevd[221665]: Network interface NamePolicy= disabled on kernel command line.
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:05:48.509 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[2fdb4ed8-bbdc-484b-89f4-4e2c9e14f6ea]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:05:48.513 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[2501b6fb-a08b-40fb-929d-58cfc13d3e7b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:05:48 compute-0 NetworkManager[51722]: <info>  [1759874748.5488] device (tap7f17307e-a0): carrier: link connected
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:05:48.561 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[65406a02-f5a7-4bf2-b574-5b93e130d9fe]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:05:48.585 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[8bc723f7-dcd9-47e0-915e-76ebdc5cecbf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7f17307e-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:d3:d5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 448213, 'reachable_time': 15746, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221695, 'error': None, 'target': 'ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:05:48.605 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[56d9b77b-a173-431c-8514-ac95a8af1574]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea7:d3d5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 448213, 'tstamp': 448213}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221696, 'error': None, 'target': 'ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:05:48.629 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[6cde2fb8-36cb-4588-8f5d-37add095e94c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7f17307e-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:d3:d5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 448213, 'reachable_time': 15746, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221697, 'error': None, 'target': 'ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:05:48.679 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[e6b9381b-a75a-40cd-9264-edc973de7c1a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:05:48.747 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[41b11425-218d-49ef-a677-1a810d126aaa]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:05:48.749 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7f17307e-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:05:48.749 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:05:48.749 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7f17307e-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:05:48 compute-0 nova_compute[192716]: 2025-10-07 22:05:48.751 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:05:48 compute-0 kernel: tap7f17307e-a0: entered promiscuous mode
Oct 07 22:05:48 compute-0 NetworkManager[51722]: <info>  [1759874748.7527] manager: (tap7f17307e-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/59)
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:05:48.755 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7f17307e-a0, col_values=(('external_ids', {'iface-id': '6865dbad-0588-4cfd-9a22-08a49ea1d5a5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:05:48 compute-0 nova_compute[192716]: 2025-10-07 22:05:48.756 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:05:48 compute-0 nova_compute[192716]: 2025-10-07 22:05:48.757 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:05:48 compute-0 ovn_controller[94904]: 2025-10-07T22:05:48Z|00158|binding|INFO|Releasing lport 6865dbad-0588-4cfd-9a22-08a49ea1d5a5 from this chassis (sb_readonly=0)
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:05:48.760 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[7e275b7f-2f1b-41f2-a4c7-8ab845bedaeb]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:05:48.761 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7f17307e-ac72-4a6f-8a05-ba2eca705379.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7f17307e-ac72-4a6f-8a05-ba2eca705379.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:05:48.761 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7f17307e-ac72-4a6f-8a05-ba2eca705379.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7f17307e-ac72-4a6f-8a05-ba2eca705379.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:05:48.761 103791 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 7f17307e-ac72-4a6f-8a05-ba2eca705379 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:05:48.762 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7f17307e-ac72-4a6f-8a05-ba2eca705379.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7f17307e-ac72-4a6f-8a05-ba2eca705379.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:05:48.762 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[28584a0e-e397-40b2-950f-58441a3ab79e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:05:48.763 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7f17307e-ac72-4a6f-8a05-ba2eca705379.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7f17307e-ac72-4a6f-8a05-ba2eca705379.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:05:48.763 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[ec37a798-3ced-49a3-abed-a65b4a432ff2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:05:48.764 103791 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]: global
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]:     log         /dev/log local0 debug
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]:     log-tag     haproxy-metadata-proxy-7f17307e-ac72-4a6f-8a05-ba2eca705379
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]:     user        root
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]:     group       root
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]:     maxconn     1024
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]:     pidfile     /var/lib/neutron/external/pids/7f17307e-ac72-4a6f-8a05-ba2eca705379.pid.haproxy
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]:     daemon
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]: 
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]: defaults
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]:     log global
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]:     mode http
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]:     option httplog
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]:     option dontlognull
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]:     option http-server-close
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]:     option forwardfor
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]:     retries                 3
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]:     timeout http-request    30s
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]:     timeout connect         30s
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]:     timeout client          32s
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]:     timeout server          32s
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]:     timeout http-keep-alive 30s
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]: 
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]: listen listener
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]:     bind 169.254.169.254:80
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]:     
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]:     server metadata /var/lib/neutron/metadata_proxy
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]: 
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]:     http-request add-header X-OVN-Network-ID 7f17307e-ac72-4a6f-8a05-ba2eca705379
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Oct 07 22:05:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:05:48.764 103791 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379', 'env', 'PROCESS_TAG=haproxy-7f17307e-ac72-4a6f-8a05-ba2eca705379', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7f17307e-ac72-4a6f-8a05-ba2eca705379.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Oct 07 22:05:48 compute-0 nova_compute[192716]: 2025-10-07 22:05:48.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:05:48 compute-0 nova_compute[192716]: 2025-10-07 22:05:48.886 2 DEBUG nova.compute.manager [req-1514071f-0a00-494f-8a91-d764841ddca4 req-7a7ac87b-ab63-4c24-bcf2-31c699633252 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Received event network-vif-plugged-02ca2353-7f96-4a92-ad90-875a2cf33c00 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:05:48 compute-0 nova_compute[192716]: 2025-10-07 22:05:48.886 2 DEBUG oslo_concurrency.lockutils [req-1514071f-0a00-494f-8a91-d764841ddca4 req-7a7ac87b-ab63-4c24-bcf2-31c699633252 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "598c5a67-7d06-4cd6-a149-2137b69c64f4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:05:48 compute-0 nova_compute[192716]: 2025-10-07 22:05:48.887 2 DEBUG oslo_concurrency.lockutils [req-1514071f-0a00-494f-8a91-d764841ddca4 req-7a7ac87b-ab63-4c24-bcf2-31c699633252 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "598c5a67-7d06-4cd6-a149-2137b69c64f4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:05:48 compute-0 nova_compute[192716]: 2025-10-07 22:05:48.887 2 DEBUG oslo_concurrency.lockutils [req-1514071f-0a00-494f-8a91-d764841ddca4 req-7a7ac87b-ab63-4c24-bcf2-31c699633252 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "598c5a67-7d06-4cd6-a149-2137b69c64f4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:05:48 compute-0 nova_compute[192716]: 2025-10-07 22:05:48.888 2 DEBUG nova.compute.manager [req-1514071f-0a00-494f-8a91-d764841ddca4 req-7a7ac87b-ab63-4c24-bcf2-31c699633252 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Processing event network-vif-plugged-02ca2353-7f96-4a92-ad90-875a2cf33c00 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Oct 07 22:05:49 compute-0 podman[221736]: 2025-10-07 22:05:49.164250112 +0000 UTC m=+0.059773650 container create 6def01934303585e3e21b4b0909e6c702ca230431e53c1a23f06d8ac984eb6c2 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest)
Oct 07 22:05:49 compute-0 nova_compute[192716]: 2025-10-07 22:05:49.176 2 DEBUG nova.compute.manager [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Oct 07 22:05:49 compute-0 nova_compute[192716]: 2025-10-07 22:05:49.181 2 DEBUG nova.virt.libvirt.driver [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Oct 07 22:05:49 compute-0 nova_compute[192716]: 2025-10-07 22:05:49.185 2 INFO nova.virt.libvirt.driver [-] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Instance spawned successfully.
Oct 07 22:05:49 compute-0 nova_compute[192716]: 2025-10-07 22:05:49.185 2 DEBUG nova.virt.libvirt.driver [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Oct 07 22:05:49 compute-0 systemd[1]: Started libpod-conmon-6def01934303585e3e21b4b0909e6c702ca230431e53c1a23f06d8ac984eb6c2.scope.
Oct 07 22:05:49 compute-0 podman[221736]: 2025-10-07 22:05:49.128541399 +0000 UTC m=+0.024064967 image pull 24d4277b41bbd1d97b6f360ea068040fe96182680512bacad34d1f578f4798a9 38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Oct 07 22:05:49 compute-0 systemd[1]: Started libcrun container.
Oct 07 22:05:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2eaadd45e6cb47fedd4ee2b85052d3b8d1de553990d8e7a8a835dfead8fdc58a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 07 22:05:49 compute-0 podman[221736]: 2025-10-07 22:05:49.289468875 +0000 UTC m=+0.184992443 container init 6def01934303585e3e21b4b0909e6c702ca230431e53c1a23f06d8ac984eb6c2 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379, io.buildah.version=1.41.4, tcib_managed=true, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS)
Oct 07 22:05:49 compute-0 podman[221736]: 2025-10-07 22:05:49.297016543 +0000 UTC m=+0.192540081 container start 6def01934303585e3e21b4b0909e6c702ca230431e53c1a23f06d8ac984eb6c2 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Oct 07 22:05:49 compute-0 neutron-haproxy-ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379[221751]: [NOTICE]   (221755) : New worker (221757) forked
Oct 07 22:05:49 compute-0 neutron-haproxy-ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379[221751]: [NOTICE]   (221755) : Loading success.
Oct 07 22:05:49 compute-0 nova_compute[192716]: 2025-10-07 22:05:49.702 2 DEBUG nova.virt.libvirt.driver [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 22:05:49 compute-0 nova_compute[192716]: 2025-10-07 22:05:49.704 2 DEBUG nova.virt.libvirt.driver [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 22:05:49 compute-0 nova_compute[192716]: 2025-10-07 22:05:49.704 2 DEBUG nova.virt.libvirt.driver [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 22:05:49 compute-0 nova_compute[192716]: 2025-10-07 22:05:49.705 2 DEBUG nova.virt.libvirt.driver [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 22:05:49 compute-0 nova_compute[192716]: 2025-10-07 22:05:49.706 2 DEBUG nova.virt.libvirt.driver [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 22:05:49 compute-0 nova_compute[192716]: 2025-10-07 22:05:49.706 2 DEBUG nova.virt.libvirt.driver [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 22:05:49 compute-0 nova_compute[192716]: 2025-10-07 22:05:49.892 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:05:50 compute-0 nova_compute[192716]: 2025-10-07 22:05:50.216 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:05:50 compute-0 nova_compute[192716]: 2025-10-07 22:05:50.219 2 INFO nova.compute.manager [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Took 12.61 seconds to spawn the instance on the hypervisor.
Oct 07 22:05:50 compute-0 nova_compute[192716]: 2025-10-07 22:05:50.219 2 DEBUG nova.compute.manager [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 07 22:05:50 compute-0 nova_compute[192716]: 2025-10-07 22:05:50.762 2 INFO nova.compute.manager [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Took 17.87 seconds to build instance.
Oct 07 22:05:50 compute-0 nova_compute[192716]: 2025-10-07 22:05:50.974 2 DEBUG nova.compute.manager [req-0afe980b-bd08-4319-a475-9fa3c6cbe7d8 req-538320b1-bca2-45cd-ab08-3b03fd87f1a3 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Received event network-vif-plugged-02ca2353-7f96-4a92-ad90-875a2cf33c00 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:05:50 compute-0 nova_compute[192716]: 2025-10-07 22:05:50.975 2 DEBUG oslo_concurrency.lockutils [req-0afe980b-bd08-4319-a475-9fa3c6cbe7d8 req-538320b1-bca2-45cd-ab08-3b03fd87f1a3 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "598c5a67-7d06-4cd6-a149-2137b69c64f4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:05:50 compute-0 nova_compute[192716]: 2025-10-07 22:05:50.976 2 DEBUG oslo_concurrency.lockutils [req-0afe980b-bd08-4319-a475-9fa3c6cbe7d8 req-538320b1-bca2-45cd-ab08-3b03fd87f1a3 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "598c5a67-7d06-4cd6-a149-2137b69c64f4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:05:50 compute-0 nova_compute[192716]: 2025-10-07 22:05:50.976 2 DEBUG oslo_concurrency.lockutils [req-0afe980b-bd08-4319-a475-9fa3c6cbe7d8 req-538320b1-bca2-45cd-ab08-3b03fd87f1a3 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "598c5a67-7d06-4cd6-a149-2137b69c64f4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:05:50 compute-0 nova_compute[192716]: 2025-10-07 22:05:50.977 2 DEBUG nova.compute.manager [req-0afe980b-bd08-4319-a475-9fa3c6cbe7d8 req-538320b1-bca2-45cd-ab08-3b03fd87f1a3 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] No waiting events found dispatching network-vif-plugged-02ca2353-7f96-4a92-ad90-875a2cf33c00 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 22:05:50 compute-0 nova_compute[192716]: 2025-10-07 22:05:50.977 2 WARNING nova.compute.manager [req-0afe980b-bd08-4319-a475-9fa3c6cbe7d8 req-538320b1-bca2-45cd-ab08-3b03fd87f1a3 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Received unexpected event network-vif-plugged-02ca2353-7f96-4a92-ad90-875a2cf33c00 for instance with vm_state active and task_state None.
Oct 07 22:05:51 compute-0 nova_compute[192716]: 2025-10-07 22:05:51.268 2 DEBUG oslo_concurrency.lockutils [None req-4e153120-6e41-409a-8f96-87b31f06ce05 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Lock "598c5a67-7d06-4cd6-a149-2137b69c64f4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.392s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:05:54 compute-0 nova_compute[192716]: 2025-10-07 22:05:54.894 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:05:55 compute-0 nova_compute[192716]: 2025-10-07 22:05:55.246 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:05:59 compute-0 podman[203153]: time="2025-10-07T22:05:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:05:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:05:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20751 "" "Go-http-client/1.1"
Oct 07 22:05:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:05:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3485 "" "Go-http-client/1.1"
Oct 07 22:05:59 compute-0 nova_compute[192716]: 2025-10-07 22:05:59.895 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:00 compute-0 nova_compute[192716]: 2025-10-07 22:06:00.249 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:01 compute-0 ovn_controller[94904]: 2025-10-07T22:06:01Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:63:ff:62 10.100.0.7
Oct 07 22:06:01 compute-0 ovn_controller[94904]: 2025-10-07T22:06:01Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:63:ff:62 10.100.0.7
Oct 07 22:06:01 compute-0 openstack_network_exporter[205305]: ERROR   22:06:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:06:01 compute-0 openstack_network_exporter[205305]: ERROR   22:06:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:06:01 compute-0 openstack_network_exporter[205305]: ERROR   22:06:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:06:01 compute-0 openstack_network_exporter[205305]: ERROR   22:06:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:06:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:06:01 compute-0 openstack_network_exporter[205305]: ERROR   22:06:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:06:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:06:02 compute-0 podman[221780]: 2025-10-07 22:06:02.854518757 +0000 UTC m=+0.074871797 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 07 22:06:02 compute-0 podman[221779]: 2025-10-07 22:06:02.860117609 +0000 UTC m=+0.087059649 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 07 22:06:04 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Oct 07 22:06:04 compute-0 nova_compute[192716]: 2025-10-07 22:06:04.898 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:05 compute-0 nova_compute[192716]: 2025-10-07 22:06:05.250 2 DEBUG nova.virt.libvirt.driver [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b56e99c5-2492-4e95-845b-7d1cf831bc5b] Creating tmpfile /var/lib/nova/instances/tmpe18fgw2b to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Oct 07 22:06:05 compute-0 nova_compute[192716]: 2025-10-07 22:06:05.252 2 WARNING neutronclient.v2_0.client [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:06:05 compute-0 nova_compute[192716]: 2025-10-07 22:06:05.255 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:05 compute-0 nova_compute[192716]: 2025-10-07 22:06:05.376 2 DEBUG nova.compute.manager [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpe18fgw2b',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Oct 07 22:06:05 compute-0 podman[221823]: 2025-10-07 22:06:05.491444907 +0000 UTC m=+0.087928842 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 07 22:06:07 compute-0 nova_compute[192716]: 2025-10-07 22:06:07.416 2 WARNING neutronclient.v2_0.client [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:06:09 compute-0 nova_compute[192716]: 2025-10-07 22:06:09.899 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:10 compute-0 nova_compute[192716]: 2025-10-07 22:06:10.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:11 compute-0 nova_compute[192716]: 2025-10-07 22:06:11.640 2 DEBUG nova.compute.manager [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpe18fgw2b',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b56e99c5-2492-4e95-845b-7d1cf831bc5b',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Oct 07 22:06:12 compute-0 nova_compute[192716]: 2025-10-07 22:06:12.661 2 DEBUG oslo_concurrency.lockutils [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "refresh_cache-b56e99c5-2492-4e95-845b-7d1cf831bc5b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 07 22:06:12 compute-0 nova_compute[192716]: 2025-10-07 22:06:12.661 2 DEBUG oslo_concurrency.lockutils [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquired lock "refresh_cache-b56e99c5-2492-4e95-845b-7d1cf831bc5b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 07 22:06:12 compute-0 nova_compute[192716]: 2025-10-07 22:06:12.662 2 DEBUG nova.network.neutron [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b56e99c5-2492-4e95-845b-7d1cf831bc5b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 07 22:06:12 compute-0 podman[221847]: 2025-10-07 22:06:12.892481949 +0000 UTC m=+0.123948965 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller)
Oct 07 22:06:13 compute-0 nova_compute[192716]: 2025-10-07 22:06:13.175 2 WARNING neutronclient.v2_0.client [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:06:14 compute-0 nova_compute[192716]: 2025-10-07 22:06:14.888 2 WARNING neutronclient.v2_0.client [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:06:14 compute-0 nova_compute[192716]: 2025-10-07 22:06:14.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:15 compute-0 nova_compute[192716]: 2025-10-07 22:06:15.056 2 DEBUG nova.network.neutron [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b56e99c5-2492-4e95-845b-7d1cf831bc5b] Updating instance_info_cache with network_info: [{"id": "3ca69259-25b6-4f2f-885c-64037218e12d", "address": "fa:16:3e:76:18:e7", "network": {"id": "7f17307e-ac72-4a6f-8a05-ba2eca705379", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-301862773-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ddfd0140eaa4d5e8d43efda963767d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ca69259-25", "ovs_interfaceid": "3ca69259-25b6-4f2f-885c-64037218e12d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:06:15 compute-0 nova_compute[192716]: 2025-10-07 22:06:15.291 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:15 compute-0 nova_compute[192716]: 2025-10-07 22:06:15.562 2 DEBUG oslo_concurrency.lockutils [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Releasing lock "refresh_cache-b56e99c5-2492-4e95-845b-7d1cf831bc5b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 07 22:06:15 compute-0 nova_compute[192716]: 2025-10-07 22:06:15.581 2 DEBUG nova.virt.libvirt.driver [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b56e99c5-2492-4e95-845b-7d1cf831bc5b] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpe18fgw2b',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b56e99c5-2492-4e95-845b-7d1cf831bc5b',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Oct 07 22:06:15 compute-0 nova_compute[192716]: 2025-10-07 22:06:15.582 2 DEBUG nova.virt.libvirt.driver [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b56e99c5-2492-4e95-845b-7d1cf831bc5b] Creating instance directory: /var/lib/nova/instances/b56e99c5-2492-4e95-845b-7d1cf831bc5b pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Oct 07 22:06:15 compute-0 nova_compute[192716]: 2025-10-07 22:06:15.582 2 DEBUG nova.virt.libvirt.driver [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b56e99c5-2492-4e95-845b-7d1cf831bc5b] Creating disk.info with the contents: {'/var/lib/nova/instances/b56e99c5-2492-4e95-845b-7d1cf831bc5b/disk': 'qcow2', '/var/lib/nova/instances/b56e99c5-2492-4e95-845b-7d1cf831bc5b/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Oct 07 22:06:15 compute-0 nova_compute[192716]: 2025-10-07 22:06:15.583 2 DEBUG nova.virt.libvirt.driver [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b56e99c5-2492-4e95-845b-7d1cf831bc5b] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Oct 07 22:06:15 compute-0 nova_compute[192716]: 2025-10-07 22:06:15.583 2 DEBUG nova.objects.instance [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lazy-loading 'trusted_certs' on Instance uuid b56e99c5-2492-4e95-845b-7d1cf831bc5b obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 22:06:15 compute-0 podman[221874]: 2025-10-07 22:06:15.822843503 +0000 UTC m=+0.056066118 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4)
Oct 07 22:06:16 compute-0 nova_compute[192716]: 2025-10-07 22:06:16.090 2 DEBUG oslo_utils.imageutils.format_inspector [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:06:16 compute-0 nova_compute[192716]: 2025-10-07 22:06:16.097 2 DEBUG oslo_utils.imageutils.format_inspector [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:06:16 compute-0 nova_compute[192716]: 2025-10-07 22:06:16.099 2 DEBUG oslo_concurrency.processutils [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:06:16 compute-0 nova_compute[192716]: 2025-10-07 22:06:16.190 2 DEBUG oslo_concurrency.processutils [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:06:16 compute-0 nova_compute[192716]: 2025-10-07 22:06:16.191 2 DEBUG oslo_concurrency.lockutils [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:06:16 compute-0 nova_compute[192716]: 2025-10-07 22:06:16.192 2 DEBUG oslo_concurrency.lockutils [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:06:16 compute-0 nova_compute[192716]: 2025-10-07 22:06:16.193 2 DEBUG oslo_utils.imageutils.format_inspector [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:06:16 compute-0 nova_compute[192716]: 2025-10-07 22:06:16.201 2 DEBUG oslo_utils.imageutils.format_inspector [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:06:16 compute-0 nova_compute[192716]: 2025-10-07 22:06:16.201 2 DEBUG oslo_concurrency.processutils [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:06:16 compute-0 nova_compute[192716]: 2025-10-07 22:06:16.257 2 DEBUG oslo_concurrency.processutils [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:06:16 compute-0 nova_compute[192716]: 2025-10-07 22:06:16.258 2 DEBUG oslo_concurrency.processutils [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71,backing_fmt=raw /var/lib/nova/instances/b56e99c5-2492-4e95-845b-7d1cf831bc5b/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:06:16 compute-0 nova_compute[192716]: 2025-10-07 22:06:16.317 2 DEBUG oslo_concurrency.processutils [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71,backing_fmt=raw /var/lib/nova/instances/b56e99c5-2492-4e95-845b-7d1cf831bc5b/disk 1073741824" returned: 0 in 0.059s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:06:16 compute-0 nova_compute[192716]: 2025-10-07 22:06:16.318 2 DEBUG oslo_concurrency.lockutils [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.125s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:06:16 compute-0 nova_compute[192716]: 2025-10-07 22:06:16.318 2 DEBUG oslo_concurrency.processutils [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:06:16 compute-0 nova_compute[192716]: 2025-10-07 22:06:16.398 2 DEBUG oslo_concurrency.processutils [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:06:16 compute-0 nova_compute[192716]: 2025-10-07 22:06:16.399 2 DEBUG nova.virt.disk.api [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Checking if we can resize image /var/lib/nova/instances/b56e99c5-2492-4e95-845b-7d1cf831bc5b/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 07 22:06:16 compute-0 nova_compute[192716]: 2025-10-07 22:06:16.400 2 DEBUG oslo_concurrency.processutils [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b56e99c5-2492-4e95-845b-7d1cf831bc5b/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:06:16 compute-0 nova_compute[192716]: 2025-10-07 22:06:16.492 2 DEBUG oslo_concurrency.processutils [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b56e99c5-2492-4e95-845b-7d1cf831bc5b/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:06:16 compute-0 nova_compute[192716]: 2025-10-07 22:06:16.494 2 DEBUG nova.virt.disk.api [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Cannot resize image /var/lib/nova/instances/b56e99c5-2492-4e95-845b-7d1cf831bc5b/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 07 22:06:16 compute-0 nova_compute[192716]: 2025-10-07 22:06:16.495 2 DEBUG nova.objects.instance [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lazy-loading 'migration_context' on Instance uuid b56e99c5-2492-4e95-845b-7d1cf831bc5b obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 22:06:17 compute-0 nova_compute[192716]: 2025-10-07 22:06:17.004 2 DEBUG nova.objects.base [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Object Instance<b56e99c5-2492-4e95-845b-7d1cf831bc5b> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Oct 07 22:06:17 compute-0 nova_compute[192716]: 2025-10-07 22:06:17.004 2 DEBUG oslo_concurrency.processutils [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/b56e99c5-2492-4e95-845b-7d1cf831bc5b/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:06:17 compute-0 nova_compute[192716]: 2025-10-07 22:06:17.043 2 DEBUG oslo_concurrency.processutils [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/b56e99c5-2492-4e95-845b-7d1cf831bc5b/disk.config 497664" returned: 0 in 0.038s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:06:17 compute-0 nova_compute[192716]: 2025-10-07 22:06:17.044 2 DEBUG nova.virt.libvirt.driver [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b56e99c5-2492-4e95-845b-7d1cf831bc5b] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Oct 07 22:06:17 compute-0 nova_compute[192716]: 2025-10-07 22:06:17.046 2 DEBUG nova.virt.libvirt.vif [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-10-07T22:05:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-1550034253',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-155',id=16,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-07T22:05:26Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='faa7f94deef04b67982eaf47a775c225',ramdisk_id='',reservation_id='r-ugwf2qmy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-2065444639',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-2065444639-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T22:05:26Z,user_data=None,user_id='641fbca23ed24b428028d3bc567991bf',uuid=b56e99c5-2492-4e95-845b-7d1cf831bc5b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3ca69259-25b6-4f2f-885c-64037218e12d", "address": "fa:16:3e:76:18:e7", "network": {"id": "7f17307e-ac72-4a6f-8a05-ba2eca705379", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-301862773-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ddfd0140eaa4d5e8d43efda963767d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap3ca69259-25", "ovs_interfaceid": "3ca69259-25b6-4f2f-885c-64037218e12d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 07 22:06:17 compute-0 nova_compute[192716]: 2025-10-07 22:06:17.047 2 DEBUG nova.network.os_vif_util [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Converting VIF {"id": "3ca69259-25b6-4f2f-885c-64037218e12d", "address": "fa:16:3e:76:18:e7", "network": {"id": "7f17307e-ac72-4a6f-8a05-ba2eca705379", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-301862773-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ddfd0140eaa4d5e8d43efda963767d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap3ca69259-25", "ovs_interfaceid": "3ca69259-25b6-4f2f-885c-64037218e12d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 22:06:17 compute-0 nova_compute[192716]: 2025-10-07 22:06:17.048 2 DEBUG nova.network.os_vif_util [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:76:18:e7,bridge_name='br-int',has_traffic_filtering=True,id=3ca69259-25b6-4f2f-885c-64037218e12d,network=Network(7f17307e-ac72-4a6f-8a05-ba2eca705379),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ca69259-25') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 22:06:17 compute-0 nova_compute[192716]: 2025-10-07 22:06:17.049 2 DEBUG os_vif [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:76:18:e7,bridge_name='br-int',has_traffic_filtering=True,id=3ca69259-25b6-4f2f-885c-64037218e12d,network=Network(7f17307e-ac72-4a6f-8a05-ba2eca705379),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ca69259-25') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 07 22:06:17 compute-0 nova_compute[192716]: 2025-10-07 22:06:17.050 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:17 compute-0 nova_compute[192716]: 2025-10-07 22:06:17.051 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:06:17 compute-0 nova_compute[192716]: 2025-10-07 22:06:17.052 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:06:17 compute-0 nova_compute[192716]: 2025-10-07 22:06:17.053 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:17 compute-0 nova_compute[192716]: 2025-10-07 22:06:17.053 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '93c6fc4a-954a-547d-bf7e-091947c4a5d9', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:06:17 compute-0 nova_compute[192716]: 2025-10-07 22:06:17.055 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:17 compute-0 nova_compute[192716]: 2025-10-07 22:06:17.057 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:17 compute-0 nova_compute[192716]: 2025-10-07 22:06:17.061 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:17 compute-0 nova_compute[192716]: 2025-10-07 22:06:17.061 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3ca69259-25, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:06:17 compute-0 nova_compute[192716]: 2025-10-07 22:06:17.062 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap3ca69259-25, col_values=(('qos', UUID('2d763174-a2b9-44c6-bc9e-5d2339f528b1')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:06:17 compute-0 nova_compute[192716]: 2025-10-07 22:06:17.063 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap3ca69259-25, col_values=(('external_ids', {'iface-id': '3ca69259-25b6-4f2f-885c-64037218e12d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:76:18:e7', 'vm-uuid': 'b56e99c5-2492-4e95-845b-7d1cf831bc5b'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:06:17 compute-0 nova_compute[192716]: 2025-10-07 22:06:17.065 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:17 compute-0 NetworkManager[51722]: <info>  [1759874777.0661] manager: (tap3ca69259-25): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/60)
Oct 07 22:06:17 compute-0 nova_compute[192716]: 2025-10-07 22:06:17.068 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 07 22:06:17 compute-0 nova_compute[192716]: 2025-10-07 22:06:17.073 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:17 compute-0 nova_compute[192716]: 2025-10-07 22:06:17.074 2 INFO os_vif [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:76:18:e7,bridge_name='br-int',has_traffic_filtering=True,id=3ca69259-25b6-4f2f-885c-64037218e12d,network=Network(7f17307e-ac72-4a6f-8a05-ba2eca705379),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ca69259-25')
Oct 07 22:06:17 compute-0 nova_compute[192716]: 2025-10-07 22:06:17.075 2 DEBUG nova.virt.libvirt.driver [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Oct 07 22:06:17 compute-0 nova_compute[192716]: 2025-10-07 22:06:17.076 2 DEBUG nova.compute.manager [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpe18fgw2b',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b56e99c5-2492-4e95-845b-7d1cf831bc5b',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Oct 07 22:06:17 compute-0 nova_compute[192716]: 2025-10-07 22:06:17.077 2 WARNING neutronclient.v2_0.client [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:06:17 compute-0 nova_compute[192716]: 2025-10-07 22:06:17.911 2 WARNING neutronclient.v2_0.client [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:06:18 compute-0 ovn_controller[94904]: 2025-10-07T22:06:18Z|00159|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Oct 07 22:06:18 compute-0 nova_compute[192716]: 2025-10-07 22:06:18.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:18 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:06:18.765 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:e1:cc', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '62:84:ba:a4:1b:31'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:06:18 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:06:18.766 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 07 22:06:18 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:06:18.770 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=dca786dc-b408-4181-8e47-0e14c60f13da, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:06:18 compute-0 podman[221913]: 2025-10-07 22:06:18.864038746 +0000 UTC m=+0.088938931 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, name=ubi9-minimal, build-date=2025-08-20T13:12:41, architecture=x86_64, version=9.6, managed_by=edpm_ansible, release=1755695350, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-type=git, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct 07 22:06:19 compute-0 nova_compute[192716]: 2025-10-07 22:06:19.938 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:20 compute-0 nova_compute[192716]: 2025-10-07 22:06:20.049 2 DEBUG nova.network.neutron [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b56e99c5-2492-4e95-845b-7d1cf831bc5b] Port 3ca69259-25b6-4f2f-885c-64037218e12d updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Oct 07 22:06:20 compute-0 nova_compute[192716]: 2025-10-07 22:06:20.067 2 DEBUG nova.compute.manager [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpe18fgw2b',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b56e99c5-2492-4e95-845b-7d1cf831bc5b',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Oct 07 22:06:22 compute-0 nova_compute[192716]: 2025-10-07 22:06:22.066 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:22 compute-0 systemd[1]: Starting libvirt proxy daemon...
Oct 07 22:06:22 compute-0 systemd[1]: Started libvirt proxy daemon.
Oct 07 22:06:22 compute-0 kernel: tap3ca69259-25: entered promiscuous mode
Oct 07 22:06:22 compute-0 NetworkManager[51722]: <info>  [1759874782.4567] manager: (tap3ca69259-25): new Tun device (/org/freedesktop/NetworkManager/Devices/61)
Oct 07 22:06:22 compute-0 nova_compute[192716]: 2025-10-07 22:06:22.458 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:22 compute-0 ovn_controller[94904]: 2025-10-07T22:06:22Z|00160|binding|INFO|Claiming lport 3ca69259-25b6-4f2f-885c-64037218e12d for this additional chassis.
Oct 07 22:06:22 compute-0 ovn_controller[94904]: 2025-10-07T22:06:22Z|00161|binding|INFO|3ca69259-25b6-4f2f-885c-64037218e12d: Claiming fa:16:3e:76:18:e7 10.100.0.6
Oct 07 22:06:22 compute-0 systemd-udevd[221964]: Network interface NamePolicy= disabled on kernel command line.
Oct 07 22:06:22 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:06:22.487 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:76:18:e7 10.100.0.6'], port_security=['fa:16:3e:76:18:e7 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'b56e99c5-2492-4e95-845b-7d1cf831bc5b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7f17307e-ac72-4a6f-8a05-ba2eca705379', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'faa7f94deef04b67982eaf47a775c225', 'neutron:revision_number': '10', 'neutron:security_group_ids': '6ea0c626-bce8-4d7e-8c0d-f51033bcdaff', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1675f3b1-9c7c-4176-8c45-0239d0b298ba, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=3ca69259-25b6-4f2f-885c-64037218e12d) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:06:22 compute-0 ovn_controller[94904]: 2025-10-07T22:06:22Z|00162|binding|INFO|Setting lport 3ca69259-25b6-4f2f-885c-64037218e12d ovn-installed in OVS
Oct 07 22:06:22 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:06:22.489 103791 INFO neutron.agent.ovn.metadata.agent [-] Port 3ca69259-25b6-4f2f-885c-64037218e12d in datapath 7f17307e-ac72-4a6f-8a05-ba2eca705379 unbound from our chassis
Oct 07 22:06:22 compute-0 nova_compute[192716]: 2025-10-07 22:06:22.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:22 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:06:22.490 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7f17307e-ac72-4a6f-8a05-ba2eca705379
Oct 07 22:06:22 compute-0 nova_compute[192716]: 2025-10-07 22:06:22.495 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:22 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:06:22.507 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[ea21534c-f94e-43d4-9f18-96e4b977fee5]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:06:22 compute-0 NetworkManager[51722]: <info>  [1759874782.5104] device (tap3ca69259-25): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 07 22:06:22 compute-0 NetworkManager[51722]: <info>  [1759874782.5122] device (tap3ca69259-25): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 07 22:06:22 compute-0 systemd-machined[152719]: New machine qemu-13-instance-00000010.
Oct 07 22:06:22 compute-0 systemd[1]: Started Virtual Machine qemu-13-instance-00000010.
Oct 07 22:06:22 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:06:22.565 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[55f5a4ce-c19d-4d0e-b36f-3310c907b296]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:06:22 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:06:22.568 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[2a3e9aac-e7c5-4bca-9e17-347941e4fd2b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:06:22 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:06:22.608 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[80f867de-1ff8-4d98-a248-5e14ff83b35f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:06:22 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:06:22.635 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[0da15e5e-d2ce-450b-868e-e86476264305]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7f17307e-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:d3:d5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 448213, 'reachable_time': 15746, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221979, 'error': None, 'target': 'ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:06:22 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:06:22.653 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[4407013d-58bf-49e5-a83c-542df1f6b1d7]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap7f17307e-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 448228, 'tstamp': 448228}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221981, 'error': None, 'target': 'ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap7f17307e-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 448231, 'tstamp': 448231}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221981, 'error': None, 'target': 'ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:06:22 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:06:22.655 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7f17307e-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:06:22 compute-0 nova_compute[192716]: 2025-10-07 22:06:22.658 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:22 compute-0 nova_compute[192716]: 2025-10-07 22:06:22.660 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:22 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:06:22.661 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7f17307e-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:06:22 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:06:22.661 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:06:22 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:06:22.662 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7f17307e-a0, col_values=(('external_ids', {'iface-id': '6865dbad-0588-4cfd-9a22-08a49ea1d5a5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:06:22 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:06:22.662 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:06:22 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:06:22.664 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[88804928-4ff5-4c14-bf8f-f0b396c62e0d]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-7f17307e-ac72-4a6f-8a05-ba2eca705379\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/7f17307e-ac72-4a6f-8a05-ba2eca705379.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 7f17307e-ac72-4a6f-8a05-ba2eca705379\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:06:24 compute-0 nova_compute[192716]: 2025-10-07 22:06:24.942 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:25 compute-0 ovn_controller[94904]: 2025-10-07T22:06:25Z|00163|binding|INFO|Claiming lport 3ca69259-25b6-4f2f-885c-64037218e12d for this chassis.
Oct 07 22:06:25 compute-0 ovn_controller[94904]: 2025-10-07T22:06:25Z|00164|binding|INFO|3ca69259-25b6-4f2f-885c-64037218e12d: Claiming fa:16:3e:76:18:e7 10.100.0.6
Oct 07 22:06:25 compute-0 ovn_controller[94904]: 2025-10-07T22:06:25Z|00165|binding|INFO|Setting lport 3ca69259-25b6-4f2f-885c-64037218e12d up in Southbound
Oct 07 22:06:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:06:25.633 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:06:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:06:25.634 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:06:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:06:25.636 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:06:26 compute-0 nova_compute[192716]: 2025-10-07 22:06:26.496 2 INFO nova.compute.manager [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b56e99c5-2492-4e95-845b-7d1cf831bc5b] Post operation of migration started
Oct 07 22:06:26 compute-0 nova_compute[192716]: 2025-10-07 22:06:26.497 2 WARNING neutronclient.v2_0.client [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:06:26 compute-0 nova_compute[192716]: 2025-10-07 22:06:26.610 2 WARNING neutronclient.v2_0.client [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:06:26 compute-0 nova_compute[192716]: 2025-10-07 22:06:26.611 2 WARNING neutronclient.v2_0.client [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:06:26 compute-0 nova_compute[192716]: 2025-10-07 22:06:26.808 2 DEBUG oslo_concurrency.lockutils [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "refresh_cache-b56e99c5-2492-4e95-845b-7d1cf831bc5b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 07 22:06:26 compute-0 nova_compute[192716]: 2025-10-07 22:06:26.808 2 DEBUG oslo_concurrency.lockutils [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquired lock "refresh_cache-b56e99c5-2492-4e95-845b-7d1cf831bc5b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 07 22:06:26 compute-0 nova_compute[192716]: 2025-10-07 22:06:26.809 2 DEBUG nova.network.neutron [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b56e99c5-2492-4e95-845b-7d1cf831bc5b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 07 22:06:27 compute-0 nova_compute[192716]: 2025-10-07 22:06:27.069 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:27 compute-0 nova_compute[192716]: 2025-10-07 22:06:27.326 2 WARNING neutronclient.v2_0.client [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:06:27 compute-0 nova_compute[192716]: 2025-10-07 22:06:27.815 2 WARNING neutronclient.v2_0.client [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:06:27 compute-0 nova_compute[192716]: 2025-10-07 22:06:27.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:06:28 compute-0 nova_compute[192716]: 2025-10-07 22:06:28.199 2 DEBUG nova.network.neutron [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b56e99c5-2492-4e95-845b-7d1cf831bc5b] Updating instance_info_cache with network_info: [{"id": "3ca69259-25b6-4f2f-885c-64037218e12d", "address": "fa:16:3e:76:18:e7", "network": {"id": "7f17307e-ac72-4a6f-8a05-ba2eca705379", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-301862773-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ddfd0140eaa4d5e8d43efda963767d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ca69259-25", "ovs_interfaceid": "3ca69259-25b6-4f2f-885c-64037218e12d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:06:28 compute-0 nova_compute[192716]: 2025-10-07 22:06:28.705 2 DEBUG oslo_concurrency.lockutils [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Releasing lock "refresh_cache-b56e99c5-2492-4e95-845b-7d1cf831bc5b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 07 22:06:29 compute-0 nova_compute[192716]: 2025-10-07 22:06:29.227 2 DEBUG oslo_concurrency.lockutils [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:06:29 compute-0 nova_compute[192716]: 2025-10-07 22:06:29.227 2 DEBUG oslo_concurrency.lockutils [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:06:29 compute-0 nova_compute[192716]: 2025-10-07 22:06:29.227 2 DEBUG oslo_concurrency.lockutils [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:06:29 compute-0 nova_compute[192716]: 2025-10-07 22:06:29.233 2 INFO nova.virt.libvirt.driver [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b56e99c5-2492-4e95-845b-7d1cf831bc5b] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Oct 07 22:06:29 compute-0 virtqemud[192532]: Domain id=13 name='instance-00000010' uuid=b56e99c5-2492-4e95-845b-7d1cf831bc5b is tainted: custom-monitor
Oct 07 22:06:29 compute-0 podman[203153]: time="2025-10-07T22:06:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:06:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:06:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20751 "" "Go-http-client/1.1"
Oct 07 22:06:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:06:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3478 "" "Go-http-client/1.1"
Oct 07 22:06:29 compute-0 nova_compute[192716]: 2025-10-07 22:06:29.946 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:29 compute-0 nova_compute[192716]: 2025-10-07 22:06:29.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:06:30 compute-0 nova_compute[192716]: 2025-10-07 22:06:30.239 2 INFO nova.virt.libvirt.driver [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b56e99c5-2492-4e95-845b-7d1cf831bc5b] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Oct 07 22:06:31 compute-0 nova_compute[192716]: 2025-10-07 22:06:31.246 2 INFO nova.virt.libvirt.driver [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b56e99c5-2492-4e95-845b-7d1cf831bc5b] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Oct 07 22:06:31 compute-0 nova_compute[192716]: 2025-10-07 22:06:31.252 2 DEBUG nova.compute.manager [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b56e99c5-2492-4e95-845b-7d1cf831bc5b] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 07 22:06:31 compute-0 openstack_network_exporter[205305]: ERROR   22:06:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:06:31 compute-0 openstack_network_exporter[205305]: ERROR   22:06:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:06:31 compute-0 openstack_network_exporter[205305]: ERROR   22:06:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:06:31 compute-0 openstack_network_exporter[205305]: ERROR   22:06:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:06:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:06:31 compute-0 openstack_network_exporter[205305]: ERROR   22:06:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:06:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:06:31 compute-0 nova_compute[192716]: 2025-10-07 22:06:31.766 2 DEBUG nova.objects.instance [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b56e99c5-2492-4e95-845b-7d1cf831bc5b] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Oct 07 22:06:32 compute-0 nova_compute[192716]: 2025-10-07 22:06:32.071 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:32 compute-0 nova_compute[192716]: 2025-10-07 22:06:32.497 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:06:32 compute-0 nova_compute[192716]: 2025-10-07 22:06:32.498 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 07 22:06:32 compute-0 nova_compute[192716]: 2025-10-07 22:06:32.784 2 WARNING neutronclient.v2_0.client [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:06:33 compute-0 nova_compute[192716]: 2025-10-07 22:06:33.069 2 WARNING neutronclient.v2_0.client [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:06:33 compute-0 nova_compute[192716]: 2025-10-07 22:06:33.070 2 WARNING neutronclient.v2_0.client [None req-d608614f-9076-43e0-b726-a6af01948d54 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:06:33 compute-0 podman[222004]: 2025-10-07 22:06:33.851730671 +0000 UTC m=+0.075280580 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4)
Oct 07 22:06:33 compute-0 podman[222005]: 2025-10-07 22:06:33.868680086 +0000 UTC m=+0.092274196 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true)
Oct 07 22:06:33 compute-0 nova_compute[192716]: 2025-10-07 22:06:33.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:06:33 compute-0 nova_compute[192716]: 2025-10-07 22:06:33.990 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Oct 07 22:06:34 compute-0 nova_compute[192716]: 2025-10-07 22:06:34.498 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Oct 07 22:06:34 compute-0 nova_compute[192716]: 2025-10-07 22:06:34.949 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:35 compute-0 podman[222046]: 2025-10-07 22:06:35.862053778 +0000 UTC m=+0.086507101 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 07 22:06:36 compute-0 nova_compute[192716]: 2025-10-07 22:06:36.494 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:06:36 compute-0 nova_compute[192716]: 2025-10-07 22:06:36.989 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:06:36 compute-0 nova_compute[192716]: 2025-10-07 22:06:36.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:06:37 compute-0 nova_compute[192716]: 2025-10-07 22:06:37.073 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:37 compute-0 nova_compute[192716]: 2025-10-07 22:06:37.503 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:06:37 compute-0 nova_compute[192716]: 2025-10-07 22:06:37.504 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:06:37 compute-0 nova_compute[192716]: 2025-10-07 22:06:37.504 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:06:37 compute-0 nova_compute[192716]: 2025-10-07 22:06:37.504 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 07 22:06:37 compute-0 nova_compute[192716]: 2025-10-07 22:06:37.729 2 DEBUG oslo_concurrency.lockutils [None req-6c11772a-be16-4978-8545-8d1311993683 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Acquiring lock "598c5a67-7d06-4cd6-a149-2137b69c64f4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:06:37 compute-0 nova_compute[192716]: 2025-10-07 22:06:37.730 2 DEBUG oslo_concurrency.lockutils [None req-6c11772a-be16-4978-8545-8d1311993683 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Lock "598c5a67-7d06-4cd6-a149-2137b69c64f4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:06:37 compute-0 nova_compute[192716]: 2025-10-07 22:06:37.730 2 DEBUG oslo_concurrency.lockutils [None req-6c11772a-be16-4978-8545-8d1311993683 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Acquiring lock "598c5a67-7d06-4cd6-a149-2137b69c64f4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:06:37 compute-0 nova_compute[192716]: 2025-10-07 22:06:37.730 2 DEBUG oslo_concurrency.lockutils [None req-6c11772a-be16-4978-8545-8d1311993683 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Lock "598c5a67-7d06-4cd6-a149-2137b69c64f4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:06:37 compute-0 nova_compute[192716]: 2025-10-07 22:06:37.730 2 DEBUG oslo_concurrency.lockutils [None req-6c11772a-be16-4978-8545-8d1311993683 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Lock "598c5a67-7d06-4cd6-a149-2137b69c64f4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:06:37 compute-0 nova_compute[192716]: 2025-10-07 22:06:37.742 2 INFO nova.compute.manager [None req-6c11772a-be16-4978-8545-8d1311993683 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Terminating instance
Oct 07 22:06:38 compute-0 nova_compute[192716]: 2025-10-07 22:06:38.257 2 DEBUG nova.compute.manager [None req-6c11772a-be16-4978-8545-8d1311993683 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 07 22:06:38 compute-0 kernel: tap02ca2353-7f (unregistering): left promiscuous mode
Oct 07 22:06:38 compute-0 NetworkManager[51722]: <info>  [1759874798.2916] device (tap02ca2353-7f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 07 22:06:38 compute-0 nova_compute[192716]: 2025-10-07 22:06:38.307 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:38 compute-0 ovn_controller[94904]: 2025-10-07T22:06:38Z|00166|binding|INFO|Releasing lport 02ca2353-7f96-4a92-ad90-875a2cf33c00 from this chassis (sb_readonly=0)
Oct 07 22:06:38 compute-0 ovn_controller[94904]: 2025-10-07T22:06:38Z|00167|binding|INFO|Setting lport 02ca2353-7f96-4a92-ad90-875a2cf33c00 down in Southbound
Oct 07 22:06:38 compute-0 ovn_controller[94904]: 2025-10-07T22:06:38Z|00168|binding|INFO|Removing iface tap02ca2353-7f ovn-installed in OVS
Oct 07 22:06:38 compute-0 nova_compute[192716]: 2025-10-07 22:06:38.310 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:06:38.318 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:63:ff:62 10.100.0.7'], port_security=['fa:16:3e:63:ff:62 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '598c5a67-7d06-4cd6-a149-2137b69c64f4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7f17307e-ac72-4a6f-8a05-ba2eca705379', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'faa7f94deef04b67982eaf47a775c225', 'neutron:revision_number': '5', 'neutron:security_group_ids': '6ea0c626-bce8-4d7e-8c0d-f51033bcdaff', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1675f3b1-9c7c-4176-8c45-0239d0b298ba, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], logical_port=02ca2353-7f96-4a92-ad90-875a2cf33c00) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f071072e510>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:06:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:06:38.319 103791 INFO neutron.agent.ovn.metadata.agent [-] Port 02ca2353-7f96-4a92-ad90-875a2cf33c00 in datapath 7f17307e-ac72-4a6f-8a05-ba2eca705379 unbound from our chassis
Oct 07 22:06:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:06:38.320 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7f17307e-ac72-4a6f-8a05-ba2eca705379
Oct 07 22:06:38 compute-0 nova_compute[192716]: 2025-10-07 22:06:38.335 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:06:38.349 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[867779ed-7e34-4640-8c90-317f402fed5c]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:06:38 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000011.scope: Deactivated successfully.
Oct 07 22:06:38 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000011.scope: Consumed 13.889s CPU time.
Oct 07 22:06:38 compute-0 systemd-machined[152719]: Machine qemu-12-instance-00000011 terminated.
Oct 07 22:06:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:06:38.395 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[1dddc433-20e6-4390-a77a-886cabce2726]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:06:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:06:38.400 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[5c7c7c63-65f7-4d33-8bd9-76ddd05f8bd9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:06:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:06:38.442 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[b6c58174-9045-42fc-828e-0ac1f02f714b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:06:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:06:38.463 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[28d011d0-35b6-439e-a2cb-055fb1a54080]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7f17307e-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:d3:d5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 448213, 'reachable_time': 15746, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222082, 'error': None, 'target': 'ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:06:38 compute-0 NetworkManager[51722]: <info>  [1759874798.4839] manager: (tap02ca2353-7f): new Tun device (/org/freedesktop/NetworkManager/Devices/62)
Oct 07 22:06:38 compute-0 kernel: tap02ca2353-7f: entered promiscuous mode
Oct 07 22:06:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:06:38.484 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[f83ac824-83af-4e0f-b02d-371e506e0c2e]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap7f17307e-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 448228, 'tstamp': 448228}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222084, 'error': None, 'target': 'ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap7f17307e-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 448231, 'tstamp': 448231}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222084, 'error': None, 'target': 'ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:06:38 compute-0 kernel: tap02ca2353-7f (unregistering): left promiscuous mode
Oct 07 22:06:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:06:38.486 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7f17307e-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:06:38 compute-0 nova_compute[192716]: 2025-10-07 22:06:38.490 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:38 compute-0 nova_compute[192716]: 2025-10-07 22:06:38.498 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:38 compute-0 nova_compute[192716]: 2025-10-07 22:06:38.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:06:38.506 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7f17307e-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:06:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:06:38.507 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:06:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:06:38.507 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7f17307e-a0, col_values=(('external_ids', {'iface-id': '6865dbad-0588-4cfd-9a22-08a49ea1d5a5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:06:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:06:38.508 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:06:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:06:38.510 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[94499eab-f2eb-454f-90f4-2bda446d3a81]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-7f17307e-ac72-4a6f-8a05-ba2eca705379\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/7f17307e-ac72-4a6f-8a05-ba2eca705379.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 7f17307e-ac72-4a6f-8a05-ba2eca705379\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:06:38 compute-0 nova_compute[192716]: 2025-10-07 22:06:38.516 2 DEBUG nova.compute.manager [req-dbaedbb9-a674-4740-bb37-22f437eafb92 req-81329d18-4ef0-46ff-99c9-bc8745920130 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Received event network-vif-unplugged-02ca2353-7f96-4a92-ad90-875a2cf33c00 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:06:38 compute-0 nova_compute[192716]: 2025-10-07 22:06:38.517 2 DEBUG oslo_concurrency.lockutils [req-dbaedbb9-a674-4740-bb37-22f437eafb92 req-81329d18-4ef0-46ff-99c9-bc8745920130 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "598c5a67-7d06-4cd6-a149-2137b69c64f4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:06:38 compute-0 nova_compute[192716]: 2025-10-07 22:06:38.517 2 DEBUG oslo_concurrency.lockutils [req-dbaedbb9-a674-4740-bb37-22f437eafb92 req-81329d18-4ef0-46ff-99c9-bc8745920130 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "598c5a67-7d06-4cd6-a149-2137b69c64f4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:06:38 compute-0 nova_compute[192716]: 2025-10-07 22:06:38.517 2 DEBUG oslo_concurrency.lockutils [req-dbaedbb9-a674-4740-bb37-22f437eafb92 req-81329d18-4ef0-46ff-99c9-bc8745920130 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "598c5a67-7d06-4cd6-a149-2137b69c64f4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:06:38 compute-0 nova_compute[192716]: 2025-10-07 22:06:38.518 2 DEBUG nova.compute.manager [req-dbaedbb9-a674-4740-bb37-22f437eafb92 req-81329d18-4ef0-46ff-99c9-bc8745920130 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] No waiting events found dispatching network-vif-unplugged-02ca2353-7f96-4a92-ad90-875a2cf33c00 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 22:06:38 compute-0 nova_compute[192716]: 2025-10-07 22:06:38.518 2 DEBUG nova.compute.manager [req-dbaedbb9-a674-4740-bb37-22f437eafb92 req-81329d18-4ef0-46ff-99c9-bc8745920130 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Received event network-vif-unplugged-02ca2353-7f96-4a92-ad90-875a2cf33c00 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 07 22:06:38 compute-0 nova_compute[192716]: 2025-10-07 22:06:38.559 2 INFO nova.virt.libvirt.driver [-] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Instance destroyed successfully.
Oct 07 22:06:38 compute-0 nova_compute[192716]: 2025-10-07 22:06:38.560 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/598c5a67-7d06-4cd6-a149-2137b69c64f4/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:06:38 compute-0 nova_compute[192716]: 2025-10-07 22:06:38.572 2 DEBUG nova.objects.instance [None req-6c11772a-be16-4978-8545-8d1311993683 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Lazy-loading 'resources' on Instance uuid 598c5a67-7d06-4cd6-a149-2137b69c64f4 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 22:06:38 compute-0 nova_compute[192716]: 2025-10-07 22:06:38.642 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/598c5a67-7d06-4cd6-a149-2137b69c64f4/disk --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:06:38 compute-0 nova_compute[192716]: 2025-10-07 22:06:38.644 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/598c5a67-7d06-4cd6-a149-2137b69c64f4/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:06:38 compute-0 nova_compute[192716]: 2025-10-07 22:06:38.729 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/598c5a67-7d06-4cd6-a149-2137b69c64f4/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:06:38 compute-0 nova_compute[192716]: 2025-10-07 22:06:38.739 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b56e99c5-2492-4e95-845b-7d1cf831bc5b/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:06:38 compute-0 nova_compute[192716]: 2025-10-07 22:06:38.822 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b56e99c5-2492-4e95-845b-7d1cf831bc5b/disk --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:06:38 compute-0 nova_compute[192716]: 2025-10-07 22:06:38.824 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b56e99c5-2492-4e95-845b-7d1cf831bc5b/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:06:38 compute-0 nova_compute[192716]: 2025-10-07 22:06:38.911 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b56e99c5-2492-4e95-845b-7d1cf831bc5b/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:06:39 compute-0 nova_compute[192716]: 2025-10-07 22:06:39.085 2 DEBUG nova.virt.libvirt.vif [None req-6c11772a-be16-4978-8545-8d1311993683 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-10-07T22:05:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-81417462',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-814',id=17,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-07T22:05:50Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='faa7f94deef04b67982eaf47a775c225',ramdisk_id='',reservation_id='r-02484yh9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='vir
tio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-2065444639',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-2065444639-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T22:05:50Z,user_data=None,user_id='641fbca23ed24b428028d3bc567991bf',uuid=598c5a67-7d06-4cd6-a149-2137b69c64f4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "02ca2353-7f96-4a92-ad90-875a2cf33c00", "address": "fa:16:3e:63:ff:62", "network": {"id": "7f17307e-ac72-4a6f-8a05-ba2eca705379", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-301862773-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ddfd0140eaa4d5e8d43efda963767d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02ca2353-7f", "ovs_interfaceid": "02ca2353-7f96-4a92-ad90-875a2cf33c00", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 07 22:06:39 compute-0 nova_compute[192716]: 2025-10-07 22:06:39.086 2 DEBUG nova.network.os_vif_util [None req-6c11772a-be16-4978-8545-8d1311993683 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Converting VIF {"id": "02ca2353-7f96-4a92-ad90-875a2cf33c00", "address": "fa:16:3e:63:ff:62", "network": {"id": "7f17307e-ac72-4a6f-8a05-ba2eca705379", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-301862773-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ddfd0140eaa4d5e8d43efda963767d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02ca2353-7f", "ovs_interfaceid": "02ca2353-7f96-4a92-ad90-875a2cf33c00", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 22:06:39 compute-0 nova_compute[192716]: 2025-10-07 22:06:39.087 2 DEBUG nova.network.os_vif_util [None req-6c11772a-be16-4978-8545-8d1311993683 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:63:ff:62,bridge_name='br-int',has_traffic_filtering=True,id=02ca2353-7f96-4a92-ad90-875a2cf33c00,network=Network(7f17307e-ac72-4a6f-8a05-ba2eca705379),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02ca2353-7f') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 22:06:39 compute-0 nova_compute[192716]: 2025-10-07 22:06:39.088 2 DEBUG os_vif [None req-6c11772a-be16-4978-8545-8d1311993683 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:63:ff:62,bridge_name='br-int',has_traffic_filtering=True,id=02ca2353-7f96-4a92-ad90-875a2cf33c00,network=Network(7f17307e-ac72-4a6f-8a05-ba2eca705379),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02ca2353-7f') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 07 22:06:39 compute-0 nova_compute[192716]: 2025-10-07 22:06:39.091 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:39 compute-0 nova_compute[192716]: 2025-10-07 22:06:39.092 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap02ca2353-7f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:06:39 compute-0 nova_compute[192716]: 2025-10-07 22:06:39.095 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 07 22:06:39 compute-0 nova_compute[192716]: 2025-10-07 22:06:39.096 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:39 compute-0 nova_compute[192716]: 2025-10-07 22:06:39.097 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:39 compute-0 nova_compute[192716]: 2025-10-07 22:06:39.097 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=8a6730bb-023e-4230-b1f8-eed9261cf634) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:06:39 compute-0 nova_compute[192716]: 2025-10-07 22:06:39.098 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:39 compute-0 nova_compute[192716]: 2025-10-07 22:06:39.099 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:39 compute-0 nova_compute[192716]: 2025-10-07 22:06:39.101 2 INFO os_vif [None req-6c11772a-be16-4978-8545-8d1311993683 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:63:ff:62,bridge_name='br-int',has_traffic_filtering=True,id=02ca2353-7f96-4a92-ad90-875a2cf33c00,network=Network(7f17307e-ac72-4a6f-8a05-ba2eca705379),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02ca2353-7f')
Oct 07 22:06:39 compute-0 nova_compute[192716]: 2025-10-07 22:06:39.101 2 INFO nova.virt.libvirt.driver [None req-6c11772a-be16-4978-8545-8d1311993683 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Deleting instance files /var/lib/nova/instances/598c5a67-7d06-4cd6-a149-2137b69c64f4_del
Oct 07 22:06:39 compute-0 nova_compute[192716]: 2025-10-07 22:06:39.102 2 INFO nova.virt.libvirt.driver [None req-6c11772a-be16-4978-8545-8d1311993683 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Deletion of /var/lib/nova/instances/598c5a67-7d06-4cd6-a149-2137b69c64f4_del complete
Oct 07 22:06:39 compute-0 nova_compute[192716]: 2025-10-07 22:06:39.113 2 WARNING nova.virt.libvirt.driver [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 22:06:39 compute-0 nova_compute[192716]: 2025-10-07 22:06:39.114 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:06:39 compute-0 nova_compute[192716]: 2025-10-07 22:06:39.143 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.029s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:06:39 compute-0 nova_compute[192716]: 2025-10-07 22:06:39.145 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5528MB free_disk=73.2457160949707GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 07 22:06:39 compute-0 nova_compute[192716]: 2025-10-07 22:06:39.145 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:06:39 compute-0 nova_compute[192716]: 2025-10-07 22:06:39.146 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:06:39 compute-0 nova_compute[192716]: 2025-10-07 22:06:39.628 2 INFO nova.compute.manager [None req-6c11772a-be16-4978-8545-8d1311993683 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Took 1.37 seconds to destroy the instance on the hypervisor.
Oct 07 22:06:39 compute-0 nova_compute[192716]: 2025-10-07 22:06:39.629 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-6c11772a-be16-4978-8545-8d1311993683 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 07 22:06:39 compute-0 nova_compute[192716]: 2025-10-07 22:06:39.630 2 DEBUG nova.compute.manager [-] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 07 22:06:39 compute-0 nova_compute[192716]: 2025-10-07 22:06:39.630 2 DEBUG nova.network.neutron [-] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 07 22:06:39 compute-0 nova_compute[192716]: 2025-10-07 22:06:39.631 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:06:39 compute-0 nova_compute[192716]: 2025-10-07 22:06:39.951 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:40 compute-0 nova_compute[192716]: 2025-10-07 22:06:40.060 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:06:40 compute-0 nova_compute[192716]: 2025-10-07 22:06:40.626 2 DEBUG nova.compute.manager [req-0079550b-695b-4d0f-8b8f-f3beca65e095 req-f678d2d5-2641-416d-be34-d266ffc3849c 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Received event network-vif-unplugged-02ca2353-7f96-4a92-ad90-875a2cf33c00 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:06:40 compute-0 nova_compute[192716]: 2025-10-07 22:06:40.627 2 DEBUG oslo_concurrency.lockutils [req-0079550b-695b-4d0f-8b8f-f3beca65e095 req-f678d2d5-2641-416d-be34-d266ffc3849c 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "598c5a67-7d06-4cd6-a149-2137b69c64f4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:06:40 compute-0 nova_compute[192716]: 2025-10-07 22:06:40.627 2 DEBUG oslo_concurrency.lockutils [req-0079550b-695b-4d0f-8b8f-f3beca65e095 req-f678d2d5-2641-416d-be34-d266ffc3849c 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "598c5a67-7d06-4cd6-a149-2137b69c64f4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:06:40 compute-0 nova_compute[192716]: 2025-10-07 22:06:40.627 2 DEBUG oslo_concurrency.lockutils [req-0079550b-695b-4d0f-8b8f-f3beca65e095 req-f678d2d5-2641-416d-be34-d266ffc3849c 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "598c5a67-7d06-4cd6-a149-2137b69c64f4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:06:40 compute-0 nova_compute[192716]: 2025-10-07 22:06:40.627 2 DEBUG nova.compute.manager [req-0079550b-695b-4d0f-8b8f-f3beca65e095 req-f678d2d5-2641-416d-be34-d266ffc3849c 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] No waiting events found dispatching network-vif-unplugged-02ca2353-7f96-4a92-ad90-875a2cf33c00 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 22:06:40 compute-0 nova_compute[192716]: 2025-10-07 22:06:40.628 2 DEBUG nova.compute.manager [req-0079550b-695b-4d0f-8b8f-f3beca65e095 req-f678d2d5-2641-416d-be34-d266ffc3849c 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Received event network-vif-unplugged-02ca2353-7f96-4a92-ad90-875a2cf33c00 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 07 22:06:40 compute-0 nova_compute[192716]: 2025-10-07 22:06:40.711 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Instance 598c5a67-7d06-4cd6-a149-2137b69c64f4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 07 22:06:40 compute-0 nova_compute[192716]: 2025-10-07 22:06:40.711 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Instance b56e99c5-2492-4e95-845b-7d1cf831bc5b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 07 22:06:40 compute-0 nova_compute[192716]: 2025-10-07 22:06:40.712 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 07 22:06:40 compute-0 nova_compute[192716]: 2025-10-07 22:06:40.712 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:06:39 up  1:15,  0 user,  load average: 0.25, 0.20, 0.27\n', 'num_instances': '2', 'num_vm_active': '2', 'num_task_None': '1', 'num_os_type_None': '2', 'num_proj_faa7f94deef04b67982eaf47a775c225': '2', 'io_workload': '0', 'num_task_deleting': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 07 22:06:40 compute-0 nova_compute[192716]: 2025-10-07 22:06:40.728 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Refreshing inventories for resource provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Oct 07 22:06:40 compute-0 nova_compute[192716]: 2025-10-07 22:06:40.743 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Updating ProviderTree inventory for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Oct 07 22:06:40 compute-0 nova_compute[192716]: 2025-10-07 22:06:40.744 2 DEBUG nova.compute.provider_tree [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Updating inventory in ProviderTree for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Oct 07 22:06:40 compute-0 nova_compute[192716]: 2025-10-07 22:06:40.754 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Refreshing aggregate associations for resource provider 19d1aa8e-e3fb-43ab-9849-122569e48a32, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Oct 07 22:06:40 compute-0 nova_compute[192716]: 2025-10-07 22:06:40.765 2 DEBUG nova.compute.manager [req-2a084a34-0fea-405e-b01e-a05778bcf821 req-a0cc5d2b-6763-4da3-8d93-5ae4c8ad7ba4 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Received event network-vif-deleted-02ca2353-7f96-4a92-ad90-875a2cf33c00 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:06:40 compute-0 nova_compute[192716]: 2025-10-07 22:06:40.765 2 INFO nova.compute.manager [req-2a084a34-0fea-405e-b01e-a05778bcf821 req-a0cc5d2b-6763-4da3-8d93-5ae4c8ad7ba4 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Neutron deleted interface 02ca2353-7f96-4a92-ad90-875a2cf33c00; detaching it from the instance and deleting it from the info cache
Oct 07 22:06:40 compute-0 nova_compute[192716]: 2025-10-07 22:06:40.766 2 DEBUG nova.network.neutron [req-2a084a34-0fea-405e-b01e-a05778bcf821 req-a0cc5d2b-6763-4da3-8d93-5ae4c8ad7ba4 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:06:40 compute-0 nova_compute[192716]: 2025-10-07 22:06:40.769 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Refreshing trait associations for resource provider 19d1aa8e-e3fb-43ab-9849-122569e48a32, traits: COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_RESCUE_BFV,COMPUTE_ACCELERATORS,HW_CPU_X86_CLMUL,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NODE,HW_CPU_X86_AVX2,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSSE3,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_ARCH_X86_64,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_DEVICE_TAGGING,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_EXTEND,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_AVX,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE41,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SVM,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_BMI2,COMPUTE_SOUND_MODEL_SB16,COMPUTE_SOUND_MODEL_USB,COMPUTE_SOUND_MODEL_AC97,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_AESNI,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,HW_CPU_X86_F16C,HW_CPU_X86_MMX,COMPUTE_SECURITY_TPM_TIS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_CRB,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_ABM,HW_ARCH_X86_64,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_STORAGE_BUS_SCSI _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Oct 07 22:06:40 compute-0 nova_compute[192716]: 2025-10-07 22:06:40.815 2 DEBUG nova.compute.provider_tree [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 22:06:41 compute-0 nova_compute[192716]: 2025-10-07 22:06:41.138 2 DEBUG nova.network.neutron [-] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:06:41 compute-0 nova_compute[192716]: 2025-10-07 22:06:41.277 2 DEBUG nova.compute.manager [req-2a084a34-0fea-405e-b01e-a05778bcf821 req-a0cc5d2b-6763-4da3-8d93-5ae4c8ad7ba4 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Detach interface failed, port_id=02ca2353-7f96-4a92-ad90-875a2cf33c00, reason: Instance 598c5a67-7d06-4cd6-a149-2137b69c64f4 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 07 22:06:41 compute-0 nova_compute[192716]: 2025-10-07 22:06:41.347 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 22:06:41 compute-0 nova_compute[192716]: 2025-10-07 22:06:41.644 2 INFO nova.compute.manager [-] [instance: 598c5a67-7d06-4cd6-a149-2137b69c64f4] Took 2.01 seconds to deallocate network for instance.
Oct 07 22:06:41 compute-0 nova_compute[192716]: 2025-10-07 22:06:41.859 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 07 22:06:41 compute-0 nova_compute[192716]: 2025-10-07 22:06:41.859 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.713s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:06:42 compute-0 nova_compute[192716]: 2025-10-07 22:06:42.167 2 DEBUG oslo_concurrency.lockutils [None req-6c11772a-be16-4978-8545-8d1311993683 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:06:42 compute-0 nova_compute[192716]: 2025-10-07 22:06:42.168 2 DEBUG oslo_concurrency.lockutils [None req-6c11772a-be16-4978-8545-8d1311993683 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:06:42 compute-0 nova_compute[192716]: 2025-10-07 22:06:42.233 2 DEBUG nova.compute.provider_tree [None req-6c11772a-be16-4978-8545-8d1311993683 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 22:06:42 compute-0 nova_compute[192716]: 2025-10-07 22:06:42.743 2 DEBUG nova.scheduler.client.report [None req-6c11772a-be16-4978-8545-8d1311993683 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 22:06:43 compute-0 nova_compute[192716]: 2025-10-07 22:06:43.256 2 DEBUG oslo_concurrency.lockutils [None req-6c11772a-be16-4978-8545-8d1311993683 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.088s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:06:43 compute-0 nova_compute[192716]: 2025-10-07 22:06:43.283 2 INFO nova.scheduler.client.report [None req-6c11772a-be16-4978-8545-8d1311993683 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Deleted allocations for instance 598c5a67-7d06-4cd6-a149-2137b69c64f4
Oct 07 22:06:43 compute-0 podman[222109]: 2025-10-07 22:06:43.926392028 +0000 UTC m=+0.154598383 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 07 22:06:43 compute-0 nova_compute[192716]: 2025-10-07 22:06:43.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:06:43 compute-0 nova_compute[192716]: 2025-10-07 22:06:43.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:06:43 compute-0 nova_compute[192716]: 2025-10-07 22:06:43.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:06:43 compute-0 nova_compute[192716]: 2025-10-07 22:06:43.991 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:06:43 compute-0 nova_compute[192716]: 2025-10-07 22:06:43.991 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Oct 07 22:06:44 compute-0 nova_compute[192716]: 2025-10-07 22:06:44.099 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:44 compute-0 nova_compute[192716]: 2025-10-07 22:06:44.313 2 DEBUG oslo_concurrency.lockutils [None req-6c11772a-be16-4978-8545-8d1311993683 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Lock "598c5a67-7d06-4cd6-a149-2137b69c64f4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.583s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:06:44 compute-0 nova_compute[192716]: 2025-10-07 22:06:44.954 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:45 compute-0 nova_compute[192716]: 2025-10-07 22:06:45.860 2 DEBUG oslo_concurrency.lockutils [None req-919afe16-d0ea-4ab7-988d-da9ace9767e0 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Acquiring lock "b56e99c5-2492-4e95-845b-7d1cf831bc5b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:06:45 compute-0 nova_compute[192716]: 2025-10-07 22:06:45.861 2 DEBUG oslo_concurrency.lockutils [None req-919afe16-d0ea-4ab7-988d-da9ace9767e0 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Lock "b56e99c5-2492-4e95-845b-7d1cf831bc5b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:06:45 compute-0 nova_compute[192716]: 2025-10-07 22:06:45.862 2 DEBUG oslo_concurrency.lockutils [None req-919afe16-d0ea-4ab7-988d-da9ace9767e0 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Acquiring lock "b56e99c5-2492-4e95-845b-7d1cf831bc5b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:06:45 compute-0 nova_compute[192716]: 2025-10-07 22:06:45.862 2 DEBUG oslo_concurrency.lockutils [None req-919afe16-d0ea-4ab7-988d-da9ace9767e0 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Lock "b56e99c5-2492-4e95-845b-7d1cf831bc5b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:06:45 compute-0 nova_compute[192716]: 2025-10-07 22:06:45.863 2 DEBUG oslo_concurrency.lockutils [None req-919afe16-d0ea-4ab7-988d-da9ace9767e0 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Lock "b56e99c5-2492-4e95-845b-7d1cf831bc5b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:06:45 compute-0 nova_compute[192716]: 2025-10-07 22:06:45.881 2 INFO nova.compute.manager [None req-919afe16-d0ea-4ab7-988d-da9ace9767e0 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: b56e99c5-2492-4e95-845b-7d1cf831bc5b] Terminating instance
Oct 07 22:06:46 compute-0 nova_compute[192716]: 2025-10-07 22:06:46.404 2 DEBUG nova.compute.manager [None req-919afe16-d0ea-4ab7-988d-da9ace9767e0 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: b56e99c5-2492-4e95-845b-7d1cf831bc5b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 07 22:06:46 compute-0 kernel: tap3ca69259-25 (unregistering): left promiscuous mode
Oct 07 22:06:46 compute-0 NetworkManager[51722]: <info>  [1759874806.4335] device (tap3ca69259-25): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 07 22:06:46 compute-0 ovn_controller[94904]: 2025-10-07T22:06:46Z|00169|binding|INFO|Releasing lport 3ca69259-25b6-4f2f-885c-64037218e12d from this chassis (sb_readonly=0)
Oct 07 22:06:46 compute-0 ovn_controller[94904]: 2025-10-07T22:06:46Z|00170|binding|INFO|Setting lport 3ca69259-25b6-4f2f-885c-64037218e12d down in Southbound
Oct 07 22:06:46 compute-0 ovn_controller[94904]: 2025-10-07T22:06:46Z|00171|binding|INFO|Removing iface tap3ca69259-25 ovn-installed in OVS
Oct 07 22:06:46 compute-0 nova_compute[192716]: 2025-10-07 22:06:46.458 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:46 compute-0 nova_compute[192716]: 2025-10-07 22:06:46.463 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:46 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:06:46.466 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:76:18:e7 10.100.0.6'], port_security=['fa:16:3e:76:18:e7 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'b56e99c5-2492-4e95-845b-7d1cf831bc5b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7f17307e-ac72-4a6f-8a05-ba2eca705379', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'faa7f94deef04b67982eaf47a775c225', 'neutron:revision_number': '15', 'neutron:security_group_ids': '6ea0c626-bce8-4d7e-8c0d-f51033bcdaff', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1675f3b1-9c7c-4176-8c45-0239d0b298ba, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], logical_port=3ca69259-25b6-4f2f-885c-64037218e12d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f071072e510>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:06:46 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:06:46.468 103791 INFO neutron.agent.ovn.metadata.agent [-] Port 3ca69259-25b6-4f2f-885c-64037218e12d in datapath 7f17307e-ac72-4a6f-8a05-ba2eca705379 unbound from our chassis
Oct 07 22:06:46 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:06:46.469 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7f17307e-ac72-4a6f-8a05-ba2eca705379, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 07 22:06:46 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:06:46.470 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[c3a83143-63d6-4fd0-a1a3-d7e3a9fd39df]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:06:46 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:06:46.471 103791 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379 namespace which is not needed anymore
Oct 07 22:06:46 compute-0 nova_compute[192716]: 2025-10-07 22:06:46.472 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:46 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000010.scope: Deactivated successfully.
Oct 07 22:06:46 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000010.scope: Consumed 2.600s CPU time.
Oct 07 22:06:46 compute-0 systemd-machined[152719]: Machine qemu-13-instance-00000010 terminated.
Oct 07 22:06:46 compute-0 podman[222136]: 2025-10-07 22:06:46.54508633 +0000 UTC m=+0.090543067 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251007, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest)
Oct 07 22:06:46 compute-0 neutron-haproxy-ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379[221751]: [NOTICE]   (221755) : haproxy version is 3.0.5-8e879a5
Oct 07 22:06:46 compute-0 neutron-haproxy-ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379[221751]: [NOTICE]   (221755) : path to executable is /usr/sbin/haproxy
Oct 07 22:06:46 compute-0 neutron-haproxy-ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379[221751]: [WARNING]  (221755) : Exiting Master process...
Oct 07 22:06:46 compute-0 podman[222177]: 2025-10-07 22:06:46.606803539 +0000 UTC m=+0.030520756 container kill 6def01934303585e3e21b4b0909e6c702ca230431e53c1a23f06d8ac984eb6c2 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Oct 07 22:06:46 compute-0 neutron-haproxy-ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379[221751]: [ALERT]    (221755) : Current worker (221757) exited with code 143 (Terminated)
Oct 07 22:06:46 compute-0 neutron-haproxy-ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379[221751]: [WARNING]  (221755) : All workers exited. Exiting... (0)
Oct 07 22:06:46 compute-0 systemd[1]: libpod-6def01934303585e3e21b4b0909e6c702ca230431e53c1a23f06d8ac984eb6c2.scope: Deactivated successfully.
Oct 07 22:06:46 compute-0 nova_compute[192716]: 2025-10-07 22:06:46.633 2 DEBUG nova.compute.manager [req-684a1f9b-6135-4777-9ad0-6494c671bb70 req-92c1eb89-4108-49d9-a823-91c136baa5f4 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b56e99c5-2492-4e95-845b-7d1cf831bc5b] Received event network-vif-unplugged-3ca69259-25b6-4f2f-885c-64037218e12d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:06:46 compute-0 nova_compute[192716]: 2025-10-07 22:06:46.634 2 DEBUG oslo_concurrency.lockutils [req-684a1f9b-6135-4777-9ad0-6494c671bb70 req-92c1eb89-4108-49d9-a823-91c136baa5f4 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "b56e99c5-2492-4e95-845b-7d1cf831bc5b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:06:46 compute-0 nova_compute[192716]: 2025-10-07 22:06:46.634 2 DEBUG oslo_concurrency.lockutils [req-684a1f9b-6135-4777-9ad0-6494c671bb70 req-92c1eb89-4108-49d9-a823-91c136baa5f4 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "b56e99c5-2492-4e95-845b-7d1cf831bc5b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:06:46 compute-0 nova_compute[192716]: 2025-10-07 22:06:46.635 2 DEBUG oslo_concurrency.lockutils [req-684a1f9b-6135-4777-9ad0-6494c671bb70 req-92c1eb89-4108-49d9-a823-91c136baa5f4 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "b56e99c5-2492-4e95-845b-7d1cf831bc5b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:06:46 compute-0 nova_compute[192716]: 2025-10-07 22:06:46.635 2 DEBUG nova.compute.manager [req-684a1f9b-6135-4777-9ad0-6494c671bb70 req-92c1eb89-4108-49d9-a823-91c136baa5f4 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b56e99c5-2492-4e95-845b-7d1cf831bc5b] No waiting events found dispatching network-vif-unplugged-3ca69259-25b6-4f2f-885c-64037218e12d pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 22:06:46 compute-0 nova_compute[192716]: 2025-10-07 22:06:46.635 2 DEBUG nova.compute.manager [req-684a1f9b-6135-4777-9ad0-6494c671bb70 req-92c1eb89-4108-49d9-a823-91c136baa5f4 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b56e99c5-2492-4e95-845b-7d1cf831bc5b] Received event network-vif-unplugged-3ca69259-25b6-4f2f-885c-64037218e12d for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 07 22:06:46 compute-0 podman[222193]: 2025-10-07 22:06:46.677270549 +0000 UTC m=+0.039324078 container died 6def01934303585e3e21b4b0909e6c702ca230431e53c1a23f06d8ac984eb6c2 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251007)
Oct 07 22:06:46 compute-0 nova_compute[192716]: 2025-10-07 22:06:46.689 2 INFO nova.virt.libvirt.driver [-] [instance: b56e99c5-2492-4e95-845b-7d1cf831bc5b] Instance destroyed successfully.
Oct 07 22:06:46 compute-0 nova_compute[192716]: 2025-10-07 22:06:46.690 2 DEBUG nova.objects.instance [None req-919afe16-d0ea-4ab7-988d-da9ace9767e0 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Lazy-loading 'resources' on Instance uuid b56e99c5-2492-4e95-845b-7d1cf831bc5b obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 22:06:46 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6def01934303585e3e21b4b0909e6c702ca230431e53c1a23f06d8ac984eb6c2-userdata-shm.mount: Deactivated successfully.
Oct 07 22:06:46 compute-0 systemd[1]: var-lib-containers-storage-overlay-2eaadd45e6cb47fedd4ee2b85052d3b8d1de553990d8e7a8a835dfead8fdc58a-merged.mount: Deactivated successfully.
Oct 07 22:06:46 compute-0 podman[222193]: 2025-10-07 22:06:46.709993508 +0000 UTC m=+0.072046957 container cleanup 6def01934303585e3e21b4b0909e6c702ca230431e53c1a23f06d8ac984eb6c2 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct 07 22:06:46 compute-0 systemd[1]: libpod-conmon-6def01934303585e3e21b4b0909e6c702ca230431e53c1a23f06d8ac984eb6c2.scope: Deactivated successfully.
Oct 07 22:06:46 compute-0 podman[222197]: 2025-10-07 22:06:46.730562817 +0000 UTC m=+0.085253875 container remove 6def01934303585e3e21b4b0909e6c702ca230431e53c1a23f06d8ac984eb6c2 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4)
Oct 07 22:06:46 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:06:46.739 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[dfb02145-8679-4125-9bdc-62bc0f513448]: (4, ("Tue Oct  7 10:06:46 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379 (6def01934303585e3e21b4b0909e6c702ca230431e53c1a23f06d8ac984eb6c2)\n6def01934303585e3e21b4b0909e6c702ca230431e53c1a23f06d8ac984eb6c2\nTue Oct  7 10:06:46 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379 (6def01934303585e3e21b4b0909e6c702ca230431e53c1a23f06d8ac984eb6c2)\n6def01934303585e3e21b4b0909e6c702ca230431e53c1a23f06d8ac984eb6c2\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:06:46 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:06:46.742 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[457f97ed-569c-4a91-b82b-6e3e32545153]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:06:46 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:06:46.743 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7f17307e-ac72-4a6f-8a05-ba2eca705379.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7f17307e-ac72-4a6f-8a05-ba2eca705379.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 22:06:46 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:06:46.743 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[e69e26bd-e409-4670-a7ed-73282c278662]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:06:46 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:06:46.744 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7f17307e-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:06:46 compute-0 nova_compute[192716]: 2025-10-07 22:06:46.746 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:46 compute-0 kernel: tap7f17307e-a0: left promiscuous mode
Oct 07 22:06:46 compute-0 nova_compute[192716]: 2025-10-07 22:06:46.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:46 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:06:46.770 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[ce96b62f-597d-4f2d-a2dc-9eff37acfb50]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:06:46 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:06:46.811 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[1111fe4e-0fa1-4309-a422-9065a7796b1c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:06:46 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:06:46.812 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[5c0586e3-5d0d-44bb-90f0-3deb5f963adc]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:06:46 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:06:46.827 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[42f619a0-c691-4c26-a739-f2d4f16e11cb]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 448201, 'reachable_time': 42145, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222245, 'error': None, 'target': 'ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:06:46 compute-0 systemd[1]: run-netns-ovnmeta\x2d7f17307e\x2dac72\x2d4a6f\x2d8a05\x2dba2eca705379.mount: Deactivated successfully.
Oct 07 22:06:46 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:06:46.832 103905 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Oct 07 22:06:46 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:06:46.833 103905 DEBUG oslo.privsep.daemon [-] privsep: reply[3a2b9923-0482-46e4-8f5a-07a9e282d805]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:06:47 compute-0 nova_compute[192716]: 2025-10-07 22:06:47.199 2 DEBUG nova.virt.libvirt.vif [None req-919afe16-d0ea-4ab7-988d-da9ace9767e0 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-10-07T22:05:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-1550034253',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-155',id=16,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-07T22:05:26Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='faa7f94deef04b67982eaf47a775c225',ramdisk_id='',reservation_id='r-ugwf2qmy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',clean_attempts='1',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',ima
ge_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-2065444639',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-2065444639-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T22:06:32Z,user_data=None,user_id='641fbca23ed24b428028d3bc567991bf',uuid=b56e99c5-2492-4e95-845b-7d1cf831bc5b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3ca69259-25b6-4f2f-885c-64037218e12d", "address": "fa:16:3e:76:18:e7", "network": {"id": "7f17307e-ac72-4a6f-8a05-ba2eca705379", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-301862773-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ddfd0140eaa4d5e8d43efda963767d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ca69259-25", "ovs_interfaceid": "3ca69259-25b6-4f2f-885c-64037218e12d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 07 22:06:47 compute-0 nova_compute[192716]: 2025-10-07 22:06:47.199 2 DEBUG nova.network.os_vif_util [None req-919afe16-d0ea-4ab7-988d-da9ace9767e0 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Converting VIF {"id": "3ca69259-25b6-4f2f-885c-64037218e12d", "address": "fa:16:3e:76:18:e7", "network": {"id": "7f17307e-ac72-4a6f-8a05-ba2eca705379", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-301862773-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ddfd0140eaa4d5e8d43efda963767d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ca69259-25", "ovs_interfaceid": "3ca69259-25b6-4f2f-885c-64037218e12d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 22:06:47 compute-0 nova_compute[192716]: 2025-10-07 22:06:47.200 2 DEBUG nova.network.os_vif_util [None req-919afe16-d0ea-4ab7-988d-da9ace9767e0 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:76:18:e7,bridge_name='br-int',has_traffic_filtering=True,id=3ca69259-25b6-4f2f-885c-64037218e12d,network=Network(7f17307e-ac72-4a6f-8a05-ba2eca705379),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ca69259-25') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 22:06:47 compute-0 nova_compute[192716]: 2025-10-07 22:06:47.200 2 DEBUG os_vif [None req-919afe16-d0ea-4ab7-988d-da9ace9767e0 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:76:18:e7,bridge_name='br-int',has_traffic_filtering=True,id=3ca69259-25b6-4f2f-885c-64037218e12d,network=Network(7f17307e-ac72-4a6f-8a05-ba2eca705379),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ca69259-25') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 07 22:06:47 compute-0 nova_compute[192716]: 2025-10-07 22:06:47.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:47 compute-0 nova_compute[192716]: 2025-10-07 22:06:47.202 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3ca69259-25, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:06:47 compute-0 nova_compute[192716]: 2025-10-07 22:06:47.204 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:47 compute-0 nova_compute[192716]: 2025-10-07 22:06:47.206 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 07 22:06:47 compute-0 nova_compute[192716]: 2025-10-07 22:06:47.206 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:47 compute-0 nova_compute[192716]: 2025-10-07 22:06:47.207 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:47 compute-0 nova_compute[192716]: 2025-10-07 22:06:47.207 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=2d763174-a2b9-44c6-bc9e-5d2339f528b1) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:06:47 compute-0 nova_compute[192716]: 2025-10-07 22:06:47.208 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:47 compute-0 nova_compute[192716]: 2025-10-07 22:06:47.209 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:47 compute-0 nova_compute[192716]: 2025-10-07 22:06:47.211 2 INFO os_vif [None req-919afe16-d0ea-4ab7-988d-da9ace9767e0 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:76:18:e7,bridge_name='br-int',has_traffic_filtering=True,id=3ca69259-25b6-4f2f-885c-64037218e12d,network=Network(7f17307e-ac72-4a6f-8a05-ba2eca705379),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ca69259-25')
Oct 07 22:06:47 compute-0 nova_compute[192716]: 2025-10-07 22:06:47.212 2 INFO nova.virt.libvirt.driver [None req-919afe16-d0ea-4ab7-988d-da9ace9767e0 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: b56e99c5-2492-4e95-845b-7d1cf831bc5b] Deleting instance files /var/lib/nova/instances/b56e99c5-2492-4e95-845b-7d1cf831bc5b_del
Oct 07 22:06:47 compute-0 nova_compute[192716]: 2025-10-07 22:06:47.212 2 INFO nova.virt.libvirt.driver [None req-919afe16-d0ea-4ab7-988d-da9ace9767e0 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: b56e99c5-2492-4e95-845b-7d1cf831bc5b] Deletion of /var/lib/nova/instances/b56e99c5-2492-4e95-845b-7d1cf831bc5b_del complete
Oct 07 22:06:47 compute-0 nova_compute[192716]: 2025-10-07 22:06:47.723 2 INFO nova.compute.manager [None req-919afe16-d0ea-4ab7-988d-da9ace9767e0 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: b56e99c5-2492-4e95-845b-7d1cf831bc5b] Took 1.32 seconds to destroy the instance on the hypervisor.
Oct 07 22:06:47 compute-0 nova_compute[192716]: 2025-10-07 22:06:47.723 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-919afe16-d0ea-4ab7-988d-da9ace9767e0 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 07 22:06:47 compute-0 nova_compute[192716]: 2025-10-07 22:06:47.724 2 DEBUG nova.compute.manager [-] [instance: b56e99c5-2492-4e95-845b-7d1cf831bc5b] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 07 22:06:47 compute-0 nova_compute[192716]: 2025-10-07 22:06:47.724 2 DEBUG nova.network.neutron [-] [instance: b56e99c5-2492-4e95-845b-7d1cf831bc5b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 07 22:06:47 compute-0 nova_compute[192716]: 2025-10-07 22:06:47.725 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:06:47 compute-0 nova_compute[192716]: 2025-10-07 22:06:47.895 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:06:48 compute-0 nova_compute[192716]: 2025-10-07 22:06:48.692 2 DEBUG nova.network.neutron [-] [instance: b56e99c5-2492-4e95-845b-7d1cf831bc5b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:06:48 compute-0 nova_compute[192716]: 2025-10-07 22:06:48.698 2 DEBUG nova.compute.manager [req-b1df20b3-ab7b-46b0-9adf-ced496dd7fd8 req-19c31c50-8a2c-40a4-bee4-96f992fc6ec7 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b56e99c5-2492-4e95-845b-7d1cf831bc5b] Received event network-vif-unplugged-3ca69259-25b6-4f2f-885c-64037218e12d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:06:48 compute-0 nova_compute[192716]: 2025-10-07 22:06:48.699 2 DEBUG oslo_concurrency.lockutils [req-b1df20b3-ab7b-46b0-9adf-ced496dd7fd8 req-19c31c50-8a2c-40a4-bee4-96f992fc6ec7 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "b56e99c5-2492-4e95-845b-7d1cf831bc5b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:06:48 compute-0 nova_compute[192716]: 2025-10-07 22:06:48.699 2 DEBUG oslo_concurrency.lockutils [req-b1df20b3-ab7b-46b0-9adf-ced496dd7fd8 req-19c31c50-8a2c-40a4-bee4-96f992fc6ec7 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "b56e99c5-2492-4e95-845b-7d1cf831bc5b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:06:48 compute-0 nova_compute[192716]: 2025-10-07 22:06:48.699 2 DEBUG oslo_concurrency.lockutils [req-b1df20b3-ab7b-46b0-9adf-ced496dd7fd8 req-19c31c50-8a2c-40a4-bee4-96f992fc6ec7 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "b56e99c5-2492-4e95-845b-7d1cf831bc5b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:06:48 compute-0 nova_compute[192716]: 2025-10-07 22:06:48.699 2 DEBUG nova.compute.manager [req-b1df20b3-ab7b-46b0-9adf-ced496dd7fd8 req-19c31c50-8a2c-40a4-bee4-96f992fc6ec7 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b56e99c5-2492-4e95-845b-7d1cf831bc5b] No waiting events found dispatching network-vif-unplugged-3ca69259-25b6-4f2f-885c-64037218e12d pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 22:06:48 compute-0 nova_compute[192716]: 2025-10-07 22:06:48.700 2 DEBUG nova.compute.manager [req-b1df20b3-ab7b-46b0-9adf-ced496dd7fd8 req-19c31c50-8a2c-40a4-bee4-96f992fc6ec7 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b56e99c5-2492-4e95-845b-7d1cf831bc5b] Received event network-vif-unplugged-3ca69259-25b6-4f2f-885c-64037218e12d for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 07 22:06:48 compute-0 nova_compute[192716]: 2025-10-07 22:06:48.700 2 DEBUG nova.compute.manager [req-b1df20b3-ab7b-46b0-9adf-ced496dd7fd8 req-19c31c50-8a2c-40a4-bee4-96f992fc6ec7 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b56e99c5-2492-4e95-845b-7d1cf831bc5b] Received event network-vif-deleted-3ca69259-25b6-4f2f-885c-64037218e12d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:06:48 compute-0 nova_compute[192716]: 2025-10-07 22:06:48.700 2 INFO nova.compute.manager [req-b1df20b3-ab7b-46b0-9adf-ced496dd7fd8 req-19c31c50-8a2c-40a4-bee4-96f992fc6ec7 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b56e99c5-2492-4e95-845b-7d1cf831bc5b] Neutron deleted interface 3ca69259-25b6-4f2f-885c-64037218e12d; detaching it from the instance and deleting it from the info cache
Oct 07 22:06:48 compute-0 nova_compute[192716]: 2025-10-07 22:06:48.700 2 DEBUG nova.network.neutron [req-b1df20b3-ab7b-46b0-9adf-ced496dd7fd8 req-19c31c50-8a2c-40a4-bee4-96f992fc6ec7 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b56e99c5-2492-4e95-845b-7d1cf831bc5b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:06:49 compute-0 nova_compute[192716]: 2025-10-07 22:06:49.203 2 INFO nova.compute.manager [-] [instance: b56e99c5-2492-4e95-845b-7d1cf831bc5b] Took 1.48 seconds to deallocate network for instance.
Oct 07 22:06:49 compute-0 nova_compute[192716]: 2025-10-07 22:06:49.207 2 DEBUG nova.compute.manager [req-b1df20b3-ab7b-46b0-9adf-ced496dd7fd8 req-19c31c50-8a2c-40a4-bee4-96f992fc6ec7 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b56e99c5-2492-4e95-845b-7d1cf831bc5b] Detach interface failed, port_id=3ca69259-25b6-4f2f-885c-64037218e12d, reason: Instance b56e99c5-2492-4e95-845b-7d1cf831bc5b could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 07 22:06:49 compute-0 nova_compute[192716]: 2025-10-07 22:06:49.726 2 DEBUG oslo_concurrency.lockutils [None req-919afe16-d0ea-4ab7-988d-da9ace9767e0 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:06:49 compute-0 nova_compute[192716]: 2025-10-07 22:06:49.727 2 DEBUG oslo_concurrency.lockutils [None req-919afe16-d0ea-4ab7-988d-da9ace9767e0 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:06:49 compute-0 nova_compute[192716]: 2025-10-07 22:06:49.797 2 DEBUG nova.compute.provider_tree [None req-919afe16-d0ea-4ab7-988d-da9ace9767e0 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 22:06:49 compute-0 podman[222246]: 2025-10-07 22:06:49.868275438 +0000 UTC m=+0.099103782 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, managed_by=edpm_ansible, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-type=git, vendor=Red Hat, Inc., name=ubi9-minimal, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down 
image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm)
Oct 07 22:06:49 compute-0 nova_compute[192716]: 2025-10-07 22:06:49.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:50 compute-0 nova_compute[192716]: 2025-10-07 22:06:50.308 2 DEBUG nova.scheduler.client.report [None req-919afe16-d0ea-4ab7-988d-da9ace9767e0 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 22:06:50 compute-0 nova_compute[192716]: 2025-10-07 22:06:50.821 2 DEBUG oslo_concurrency.lockutils [None req-919afe16-d0ea-4ab7-988d-da9ace9767e0 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.095s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:06:50 compute-0 nova_compute[192716]: 2025-10-07 22:06:50.851 2 INFO nova.scheduler.client.report [None req-919afe16-d0ea-4ab7-988d-da9ace9767e0 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Deleted allocations for instance b56e99c5-2492-4e95-845b-7d1cf831bc5b
Oct 07 22:06:51 compute-0 nova_compute[192716]: 2025-10-07 22:06:51.893 2 DEBUG oslo_concurrency.lockutils [None req-919afe16-d0ea-4ab7-988d-da9ace9767e0 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Lock "b56e99c5-2492-4e95-845b-7d1cf831bc5b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.032s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:06:52 compute-0 nova_compute[192716]: 2025-10-07 22:06:52.209 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:54 compute-0 nova_compute[192716]: 2025-10-07 22:06:54.991 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:57 compute-0 nova_compute[192716]: 2025-10-07 22:06:57.211 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:06:59 compute-0 podman[203153]: time="2025-10-07T22:06:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:06:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:06:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 22:06:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:06:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3021 "" "Go-http-client/1.1"
Oct 07 22:06:59 compute-0 nova_compute[192716]: 2025-10-07 22:06:59.992 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:07:01 compute-0 openstack_network_exporter[205305]: ERROR   22:07:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:07:01 compute-0 openstack_network_exporter[205305]: ERROR   22:07:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:07:01 compute-0 openstack_network_exporter[205305]: ERROR   22:07:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:07:01 compute-0 openstack_network_exporter[205305]: ERROR   22:07:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:07:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:07:01 compute-0 openstack_network_exporter[205305]: ERROR   22:07:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:07:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:07:02 compute-0 nova_compute[192716]: 2025-10-07 22:07:02.217 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:07:04 compute-0 podman[222272]: 2025-10-07 22:07:04.835770645 +0000 UTC m=+0.069982017 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Oct 07 22:07:04 compute-0 podman[222271]: 2025-10-07 22:07:04.842095847 +0000 UTC m=+0.072043627 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251007, tcib_build_tag=watcher_latest, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=iscsid)
Oct 07 22:07:05 compute-0 nova_compute[192716]: 2025-10-07 22:07:05.028 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:07:06 compute-0 podman[222313]: 2025-10-07 22:07:06.825770149 +0000 UTC m=+0.061844124 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 07 22:07:07 compute-0 nova_compute[192716]: 2025-10-07 22:07:07.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:07:08 compute-0 sshd-session[222311]: Invalid user bot from 103.115.24.11 port 47340
Oct 07 22:07:08 compute-0 sshd-session[222311]: pam_unix(sshd:auth): check pass; user unknown
Oct 07 22:07:08 compute-0 sshd-session[222311]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.115.24.11
Oct 07 22:07:09 compute-0 sshd-session[222311]: Failed password for invalid user bot from 103.115.24.11 port 47340 ssh2
Oct 07 22:07:10 compute-0 nova_compute[192716]: 2025-10-07 22:07:10.032 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:07:10 compute-0 sshd-session[222311]: Received disconnect from 103.115.24.11 port 47340:11: Bye Bye [preauth]
Oct 07 22:07:10 compute-0 sshd-session[222311]: Disconnected from invalid user bot 103.115.24.11 port 47340 [preauth]
Oct 07 22:07:12 compute-0 nova_compute[192716]: 2025-10-07 22:07:12.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:07:14 compute-0 podman[222337]: 2025-10-07 22:07:14.869628571 +0000 UTC m=+0.109534641 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251007, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Oct 07 22:07:15 compute-0 nova_compute[192716]: 2025-10-07 22:07:15.032 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:07:16 compute-0 podman[222363]: 2025-10-07 22:07:16.827389191 +0000 UTC m=+0.067595499 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Oct 07 22:07:17 compute-0 nova_compute[192716]: 2025-10-07 22:07:17.261 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:07:20 compute-0 nova_compute[192716]: 2025-10-07 22:07:20.035 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:07:20 compute-0 nova_compute[192716]: 2025-10-07 22:07:20.123 2 DEBUG oslo_concurrency.lockutils [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Acquiring lock "3158f0ab-25fe-4a1a-8c95-8d9b702e260b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:07:20 compute-0 nova_compute[192716]: 2025-10-07 22:07:20.124 2 DEBUG oslo_concurrency.lockutils [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Lock "3158f0ab-25fe-4a1a-8c95-8d9b702e260b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:07:20 compute-0 nova_compute[192716]: 2025-10-07 22:07:20.630 2 DEBUG nova.compute.manager [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Oct 07 22:07:20 compute-0 podman[222383]: 2025-10-07 22:07:20.830765231 +0000 UTC m=+0.066991972 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, name=ubi9-minimal, version=9.6, architecture=x86_64, config_id=edpm, io.openshift.tags=minimal rhel9, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, vcs-type=git, maintainer=Red Hat, Inc., release=1755695350, io.buildah.version=1.33.7, managed_by=edpm_ansible)
Oct 07 22:07:21 compute-0 nova_compute[192716]: 2025-10-07 22:07:21.193 2 DEBUG oslo_concurrency.lockutils [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:07:21 compute-0 nova_compute[192716]: 2025-10-07 22:07:21.194 2 DEBUG oslo_concurrency.lockutils [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:07:21 compute-0 nova_compute[192716]: 2025-10-07 22:07:21.205 2 DEBUG nova.virt.hardware [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Oct 07 22:07:21 compute-0 nova_compute[192716]: 2025-10-07 22:07:21.206 2 INFO nova.compute.claims [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Claim successful on node compute-0.ctlplane.example.com
Oct 07 22:07:22 compute-0 nova_compute[192716]: 2025-10-07 22:07:22.261 2 DEBUG nova.compute.provider_tree [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 22:07:22 compute-0 nova_compute[192716]: 2025-10-07 22:07:22.263 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:07:22 compute-0 nova_compute[192716]: 2025-10-07 22:07:22.767 2 DEBUG nova.scheduler.client.report [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 22:07:23 compute-0 nova_compute[192716]: 2025-10-07 22:07:23.277 2 DEBUG oslo_concurrency.lockutils [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.083s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:07:23 compute-0 nova_compute[192716]: 2025-10-07 22:07:23.278 2 DEBUG nova.compute.manager [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Oct 07 22:07:23 compute-0 nova_compute[192716]: 2025-10-07 22:07:23.792 2 DEBUG nova.compute.manager [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Oct 07 22:07:23 compute-0 nova_compute[192716]: 2025-10-07 22:07:23.792 2 DEBUG nova.network.neutron [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Oct 07 22:07:23 compute-0 nova_compute[192716]: 2025-10-07 22:07:23.793 2 WARNING neutronclient.v2_0.client [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:07:23 compute-0 nova_compute[192716]: 2025-10-07 22:07:23.794 2 WARNING neutronclient.v2_0.client [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:07:24 compute-0 nova_compute[192716]: 2025-10-07 22:07:24.304 2 INFO nova.virt.libvirt.driver [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 07 22:07:24 compute-0 nova_compute[192716]: 2025-10-07 22:07:24.566 2 DEBUG nova.network.neutron [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Successfully created port: dae0547a-45c6-4b1f-bd90-f18af339dcb3 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Oct 07 22:07:24 compute-0 nova_compute[192716]: 2025-10-07 22:07:24.813 2 DEBUG nova.compute.manager [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Oct 07 22:07:25 compute-0 nova_compute[192716]: 2025-10-07 22:07:25.037 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:07:25 compute-0 nova_compute[192716]: 2025-10-07 22:07:25.396 2 DEBUG nova.network.neutron [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Successfully updated port: dae0547a-45c6-4b1f-bd90-f18af339dcb3 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Oct 07 22:07:25 compute-0 nova_compute[192716]: 2025-10-07 22:07:25.476 2 DEBUG nova.compute.manager [req-07c1cbad-878a-4834-a222-7e79e16b07ef req-6099cd9d-15bb-46d4-833a-b4096d1d1a7a 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Received event network-changed-dae0547a-45c6-4b1f-bd90-f18af339dcb3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:07:25 compute-0 nova_compute[192716]: 2025-10-07 22:07:25.476 2 DEBUG nova.compute.manager [req-07c1cbad-878a-4834-a222-7e79e16b07ef req-6099cd9d-15bb-46d4-833a-b4096d1d1a7a 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Refreshing instance network info cache due to event network-changed-dae0547a-45c6-4b1f-bd90-f18af339dcb3. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Oct 07 22:07:25 compute-0 nova_compute[192716]: 2025-10-07 22:07:25.477 2 DEBUG oslo_concurrency.lockutils [req-07c1cbad-878a-4834-a222-7e79e16b07ef req-6099cd9d-15bb-46d4-833a-b4096d1d1a7a 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "refresh_cache-3158f0ab-25fe-4a1a-8c95-8d9b702e260b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 07 22:07:25 compute-0 nova_compute[192716]: 2025-10-07 22:07:25.477 2 DEBUG oslo_concurrency.lockutils [req-07c1cbad-878a-4834-a222-7e79e16b07ef req-6099cd9d-15bb-46d4-833a-b4096d1d1a7a 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquired lock "refresh_cache-3158f0ab-25fe-4a1a-8c95-8d9b702e260b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 07 22:07:25 compute-0 nova_compute[192716]: 2025-10-07 22:07:25.477 2 DEBUG nova.network.neutron [req-07c1cbad-878a-4834-a222-7e79e16b07ef req-6099cd9d-15bb-46d4-833a-b4096d1d1a7a 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Refreshing network info cache for port dae0547a-45c6-4b1f-bd90-f18af339dcb3 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Oct 07 22:07:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:07:25.637 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:07:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:07:25.637 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:07:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:07:25.638 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:07:25 compute-0 nova_compute[192716]: 2025-10-07 22:07:25.830 2 DEBUG nova.compute.manager [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Oct 07 22:07:25 compute-0 nova_compute[192716]: 2025-10-07 22:07:25.832 2 DEBUG nova.virt.libvirt.driver [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Oct 07 22:07:25 compute-0 nova_compute[192716]: 2025-10-07 22:07:25.833 2 INFO nova.virt.libvirt.driver [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Creating image(s)
Oct 07 22:07:25 compute-0 nova_compute[192716]: 2025-10-07 22:07:25.834 2 DEBUG oslo_concurrency.lockutils [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Acquiring lock "/var/lib/nova/instances/3158f0ab-25fe-4a1a-8c95-8d9b702e260b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:07:25 compute-0 nova_compute[192716]: 2025-10-07 22:07:25.834 2 DEBUG oslo_concurrency.lockutils [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Lock "/var/lib/nova/instances/3158f0ab-25fe-4a1a-8c95-8d9b702e260b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:07:25 compute-0 nova_compute[192716]: 2025-10-07 22:07:25.835 2 DEBUG oslo_concurrency.lockutils [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Lock "/var/lib/nova/instances/3158f0ab-25fe-4a1a-8c95-8d9b702e260b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:07:25 compute-0 nova_compute[192716]: 2025-10-07 22:07:25.835 2 DEBUG oslo_utils.imageutils.format_inspector [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:07:25 compute-0 nova_compute[192716]: 2025-10-07 22:07:25.838 2 DEBUG oslo_utils.imageutils.format_inspector [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:07:25 compute-0 nova_compute[192716]: 2025-10-07 22:07:25.839 2 DEBUG oslo_concurrency.processutils [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:07:25 compute-0 nova_compute[192716]: 2025-10-07 22:07:25.906 2 DEBUG oslo_concurrency.lockutils [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Acquiring lock "refresh_cache-3158f0ab-25fe-4a1a-8c95-8d9b702e260b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 07 22:07:25 compute-0 nova_compute[192716]: 2025-10-07 22:07:25.922 2 DEBUG oslo_concurrency.processutils [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:07:25 compute-0 nova_compute[192716]: 2025-10-07 22:07:25.922 2 DEBUG oslo_concurrency.lockutils [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Acquiring lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:07:25 compute-0 nova_compute[192716]: 2025-10-07 22:07:25.923 2 DEBUG oslo_concurrency.lockutils [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:07:25 compute-0 nova_compute[192716]: 2025-10-07 22:07:25.923 2 DEBUG oslo_utils.imageutils.format_inspector [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:07:25 compute-0 nova_compute[192716]: 2025-10-07 22:07:25.927 2 DEBUG oslo_utils.imageutils.format_inspector [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:07:25 compute-0 nova_compute[192716]: 2025-10-07 22:07:25.928 2 DEBUG oslo_concurrency.processutils [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:07:25 compute-0 nova_compute[192716]: 2025-10-07 22:07:25.984 2 WARNING neutronclient.v2_0.client [req-07c1cbad-878a-4834-a222-7e79e16b07ef req-6099cd9d-15bb-46d4-833a-b4096d1d1a7a 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:07:26 compute-0 nova_compute[192716]: 2025-10-07 22:07:26.001 2 DEBUG oslo_concurrency.processutils [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:07:26 compute-0 nova_compute[192716]: 2025-10-07 22:07:26.002 2 DEBUG oslo_concurrency.processutils [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71,backing_fmt=raw /var/lib/nova/instances/3158f0ab-25fe-4a1a-8c95-8d9b702e260b/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:07:26 compute-0 nova_compute[192716]: 2025-10-07 22:07:26.056 2 DEBUG oslo_concurrency.processutils [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71,backing_fmt=raw /var/lib/nova/instances/3158f0ab-25fe-4a1a-8c95-8d9b702e260b/disk 1073741824" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:07:26 compute-0 nova_compute[192716]: 2025-10-07 22:07:26.058 2 DEBUG oslo_concurrency.lockutils [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.135s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:07:26 compute-0 nova_compute[192716]: 2025-10-07 22:07:26.058 2 DEBUG oslo_concurrency.processutils [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:07:26 compute-0 nova_compute[192716]: 2025-10-07 22:07:26.136 2 DEBUG oslo_concurrency.processutils [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:07:26 compute-0 nova_compute[192716]: 2025-10-07 22:07:26.137 2 DEBUG nova.virt.disk.api [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Checking if we can resize image /var/lib/nova/instances/3158f0ab-25fe-4a1a-8c95-8d9b702e260b/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 07 22:07:26 compute-0 nova_compute[192716]: 2025-10-07 22:07:26.138 2 DEBUG oslo_concurrency.processutils [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3158f0ab-25fe-4a1a-8c95-8d9b702e260b/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:07:26 compute-0 nova_compute[192716]: 2025-10-07 22:07:26.200 2 DEBUG oslo_concurrency.processutils [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3158f0ab-25fe-4a1a-8c95-8d9b702e260b/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:07:26 compute-0 nova_compute[192716]: 2025-10-07 22:07:26.201 2 DEBUG nova.virt.disk.api [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Cannot resize image /var/lib/nova/instances/3158f0ab-25fe-4a1a-8c95-8d9b702e260b/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 07 22:07:26 compute-0 nova_compute[192716]: 2025-10-07 22:07:26.202 2 DEBUG nova.virt.libvirt.driver [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Oct 07 22:07:26 compute-0 nova_compute[192716]: 2025-10-07 22:07:26.202 2 DEBUG nova.virt.libvirt.driver [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Ensure instance console log exists: /var/lib/nova/instances/3158f0ab-25fe-4a1a-8c95-8d9b702e260b/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Oct 07 22:07:26 compute-0 nova_compute[192716]: 2025-10-07 22:07:26.203 2 DEBUG oslo_concurrency.lockutils [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:07:26 compute-0 nova_compute[192716]: 2025-10-07 22:07:26.203 2 DEBUG oslo_concurrency.lockutils [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:07:26 compute-0 nova_compute[192716]: 2025-10-07 22:07:26.203 2 DEBUG oslo_concurrency.lockutils [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:07:27 compute-0 nova_compute[192716]: 2025-10-07 22:07:27.065 2 DEBUG nova.network.neutron [req-07c1cbad-878a-4834-a222-7e79e16b07ef req-6099cd9d-15bb-46d4-833a-b4096d1d1a7a 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 07 22:07:27 compute-0 nova_compute[192716]: 2025-10-07 22:07:27.283 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:07:27 compute-0 nova_compute[192716]: 2025-10-07 22:07:27.992 2 DEBUG nova.network.neutron [req-07c1cbad-878a-4834-a222-7e79e16b07ef req-6099cd9d-15bb-46d4-833a-b4096d1d1a7a 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:07:28 compute-0 nova_compute[192716]: 2025-10-07 22:07:28.500 2 DEBUG oslo_concurrency.lockutils [req-07c1cbad-878a-4834-a222-7e79e16b07ef req-6099cd9d-15bb-46d4-833a-b4096d1d1a7a 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Releasing lock "refresh_cache-3158f0ab-25fe-4a1a-8c95-8d9b702e260b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 07 22:07:28 compute-0 nova_compute[192716]: 2025-10-07 22:07:28.501 2 DEBUG oslo_concurrency.lockutils [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Acquired lock "refresh_cache-3158f0ab-25fe-4a1a-8c95-8d9b702e260b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 07 22:07:28 compute-0 nova_compute[192716]: 2025-10-07 22:07:28.501 2 DEBUG nova.network.neutron [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 07 22:07:29 compute-0 podman[203153]: time="2025-10-07T22:07:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:07:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:07:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 22:07:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:07:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3017 "" "Go-http-client/1.1"
Oct 07 22:07:30 compute-0 nova_compute[192716]: 2025-10-07 22:07:30.039 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:07:30 compute-0 nova_compute[192716]: 2025-10-07 22:07:30.074 2 DEBUG nova.network.neutron [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 07 22:07:30 compute-0 nova_compute[192716]: 2025-10-07 22:07:30.497 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:07:31 compute-0 openstack_network_exporter[205305]: ERROR   22:07:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:07:31 compute-0 openstack_network_exporter[205305]: ERROR   22:07:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:07:31 compute-0 openstack_network_exporter[205305]: ERROR   22:07:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:07:31 compute-0 openstack_network_exporter[205305]: ERROR   22:07:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:07:31 compute-0 openstack_network_exporter[205305]: ERROR   22:07:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:07:31 compute-0 nova_compute[192716]: 2025-10-07 22:07:31.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:07:31 compute-0 nova_compute[192716]: 2025-10-07 22:07:31.991 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 07 22:07:32 compute-0 nova_compute[192716]: 2025-10-07 22:07:32.286 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:07:33 compute-0 nova_compute[192716]: 2025-10-07 22:07:33.049 2 WARNING neutronclient.v2_0.client [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:07:34 compute-0 nova_compute[192716]: 2025-10-07 22:07:34.058 2 DEBUG nova.network.neutron [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Updating instance_info_cache with network_info: [{"id": "dae0547a-45c6-4b1f-bd90-f18af339dcb3", "address": "fa:16:3e:70:80:f5", "network": {"id": "7f17307e-ac72-4a6f-8a05-ba2eca705379", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-301862773-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ddfd0140eaa4d5e8d43efda963767d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdae0547a-45", "ovs_interfaceid": "dae0547a-45c6-4b1f-bd90-f18af339dcb3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:07:34 compute-0 nova_compute[192716]: 2025-10-07 22:07:34.566 2 DEBUG oslo_concurrency.lockutils [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Releasing lock "refresh_cache-3158f0ab-25fe-4a1a-8c95-8d9b702e260b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 07 22:07:34 compute-0 nova_compute[192716]: 2025-10-07 22:07:34.567 2 DEBUG nova.compute.manager [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Instance network_info: |[{"id": "dae0547a-45c6-4b1f-bd90-f18af339dcb3", "address": "fa:16:3e:70:80:f5", "network": {"id": "7f17307e-ac72-4a6f-8a05-ba2eca705379", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-301862773-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ddfd0140eaa4d5e8d43efda963767d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdae0547a-45", "ovs_interfaceid": "dae0547a-45c6-4b1f-bd90-f18af339dcb3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Oct 07 22:07:34 compute-0 nova_compute[192716]: 2025-10-07 22:07:34.571 2 DEBUG nova.virt.libvirt.driver [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Start _get_guest_xml network_info=[{"id": "dae0547a-45c6-4b1f-bd90-f18af339dcb3", "address": "fa:16:3e:70:80:f5", "network": {"id": "7f17307e-ac72-4a6f-8a05-ba2eca705379", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-301862773-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ddfd0140eaa4d5e8d43efda963767d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdae0547a-45", "ovs_interfaceid": "dae0547a-45c6-4b1f-bd90-f18af339dcb3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T21:45:40Z,direct_url=<?>,disk_format='qcow2',id=c40cab67-7e52-4762-b275-de0efa24bdf4,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='293ff4341f3d48a4ae100bf4fc7b99bd',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T21:45:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'boot_index': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'image_id': 'c40cab67-7e52-4762-b275-de0efa24bdf4'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Oct 07 22:07:34 compute-0 nova_compute[192716]: 2025-10-07 22:07:34.577 2 WARNING nova.virt.libvirt.driver [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 22:07:34 compute-0 nova_compute[192716]: 2025-10-07 22:07:34.579 2 DEBUG nova.virt.driver [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='c40cab67-7e52-4762-b275-de0efa24bdf4', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-2006630721', uuid='3158f0ab-25fe-4a1a-8c95-8d9b702e260b'), owner=OwnerMeta(userid='641fbca23ed24b428028d3bc567991bf', username='tempest-TestExecuteNodeResourceConsolidationStrategy-2065444639-project-admin', projectid='faa7f94deef04b67982eaf47a775c225', projectname='tempest-TestExecuteNodeResourceConsolidationStrategy-2065444639'), image=ImageMeta(id='c40cab67-7e52-4762-b275-de0efa24bdf4', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='e6ccb6f6-2ec4-4305-9fdd-4d931e83ed21', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "dae0547a-45c6-4b1f-bd90-f18af339dcb3", "address": "fa:16:3e:70:80:f5", "network": {"id": "7f17307e-ac72-4a6f-8a05-ba2eca705379", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-301862773-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ddfd0140eaa4d5e8d43efda963767d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": 
"tapdae0547a-45", "ovs_interfaceid": "dae0547a-45c6-4b1f-bd90-f18af339dcb3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251007122402.7278e66.el10', creation_time=1759874854.5794961) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Oct 07 22:07:34 compute-0 nova_compute[192716]: 2025-10-07 22:07:34.594 2 DEBUG nova.virt.libvirt.host [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Oct 07 22:07:34 compute-0 nova_compute[192716]: 2025-10-07 22:07:34.596 2 DEBUG nova.virt.libvirt.host [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Oct 07 22:07:34 compute-0 nova_compute[192716]: 2025-10-07 22:07:34.601 2 DEBUG nova.virt.libvirt.host [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Oct 07 22:07:34 compute-0 nova_compute[192716]: 2025-10-07 22:07:34.602 2 DEBUG nova.virt.libvirt.host [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Oct 07 22:07:34 compute-0 nova_compute[192716]: 2025-10-07 22:07:34.603 2 DEBUG nova.virt.libvirt.driver [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Oct 07 22:07:34 compute-0 nova_compute[192716]: 2025-10-07 22:07:34.603 2 DEBUG nova.virt.hardware [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-07T21:45:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='e6ccb6f6-2ec4-4305-9fdd-4d931e83ed21',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T21:45:40Z,direct_url=<?>,disk_format='qcow2',id=c40cab67-7e52-4762-b275-de0efa24bdf4,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='293ff4341f3d48a4ae100bf4fc7b99bd',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T21:45:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Oct 07 22:07:34 compute-0 nova_compute[192716]: 2025-10-07 22:07:34.603 2 DEBUG nova.virt.hardware [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Oct 07 22:07:34 compute-0 nova_compute[192716]: 2025-10-07 22:07:34.604 2 DEBUG nova.virt.hardware [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Oct 07 22:07:34 compute-0 nova_compute[192716]: 2025-10-07 22:07:34.604 2 DEBUG nova.virt.hardware [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Oct 07 22:07:34 compute-0 nova_compute[192716]: 2025-10-07 22:07:34.604 2 DEBUG nova.virt.hardware [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Oct 07 22:07:34 compute-0 nova_compute[192716]: 2025-10-07 22:07:34.604 2 DEBUG nova.virt.hardware [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Oct 07 22:07:34 compute-0 nova_compute[192716]: 2025-10-07 22:07:34.605 2 DEBUG nova.virt.hardware [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Oct 07 22:07:34 compute-0 nova_compute[192716]: 2025-10-07 22:07:34.605 2 DEBUG nova.virt.hardware [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Oct 07 22:07:34 compute-0 nova_compute[192716]: 2025-10-07 22:07:34.605 2 DEBUG nova.virt.hardware [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Oct 07 22:07:34 compute-0 nova_compute[192716]: 2025-10-07 22:07:34.605 2 DEBUG nova.virt.hardware [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Oct 07 22:07:34 compute-0 nova_compute[192716]: 2025-10-07 22:07:34.606 2 DEBUG nova.virt.hardware [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Oct 07 22:07:34 compute-0 nova_compute[192716]: 2025-10-07 22:07:34.611 2 DEBUG nova.virt.libvirt.vif [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-07T22:07:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-2006630721',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-200',id=19,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='faa7f94deef04b67982eaf47a775c225',ramdisk_id='',reservation_id='r-1zu2dw75',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-2065444639',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-2065444639-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T22:07:24Z,user_data=None,user_id='641fbca23ed24b428028d3bc567991bf',uuid=3158f0ab-25fe-4a1a-8c95-8d9b702e260b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dae0547a-45c6-4b1f-bd90-f18af339dcb3", "address": "fa:16:3e:70:80:f5", "network": {"id": "7f17307e-ac72-4a6f-8a05-ba2eca705379", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-301862773-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ddfd0140eaa4d5e8d43efda963767d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdae0547a-45", "ovs_interfaceid": "dae0547a-45c6-4b1f-bd90-f18af339dcb3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Oct 07 22:07:34 compute-0 nova_compute[192716]: 2025-10-07 22:07:34.611 2 DEBUG nova.network.os_vif_util [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Converting VIF {"id": "dae0547a-45c6-4b1f-bd90-f18af339dcb3", "address": "fa:16:3e:70:80:f5", "network": {"id": "7f17307e-ac72-4a6f-8a05-ba2eca705379", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-301862773-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ddfd0140eaa4d5e8d43efda963767d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdae0547a-45", "ovs_interfaceid": "dae0547a-45c6-4b1f-bd90-f18af339dcb3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 22:07:34 compute-0 nova_compute[192716]: 2025-10-07 22:07:34.612 2 DEBUG nova.network.os_vif_util [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:70:80:f5,bridge_name='br-int',has_traffic_filtering=True,id=dae0547a-45c6-4b1f-bd90-f18af339dcb3,network=Network(7f17307e-ac72-4a6f-8a05-ba2eca705379),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdae0547a-45') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 22:07:34 compute-0 nova_compute[192716]: 2025-10-07 22:07:34.613 2 DEBUG nova.objects.instance [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3158f0ab-25fe-4a1a-8c95-8d9b702e260b obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 22:07:35 compute-0 nova_compute[192716]: 2025-10-07 22:07:35.041 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:07:35 compute-0 nova_compute[192716]: 2025-10-07 22:07:35.122 2 DEBUG nova.virt.libvirt.driver [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] End _get_guest_xml xml=<domain type="kvm">
Oct 07 22:07:35 compute-0 nova_compute[192716]:   <uuid>3158f0ab-25fe-4a1a-8c95-8d9b702e260b</uuid>
Oct 07 22:07:35 compute-0 nova_compute[192716]:   <name>instance-00000013</name>
Oct 07 22:07:35 compute-0 nova_compute[192716]:   <memory>131072</memory>
Oct 07 22:07:35 compute-0 nova_compute[192716]:   <vcpu>1</vcpu>
Oct 07 22:07:35 compute-0 nova_compute[192716]:   <metadata>
Oct 07 22:07:35 compute-0 nova_compute[192716]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 07 22:07:35 compute-0 nova_compute[192716]:       <nova:package version="32.1.0-0.20251007122402.7278e66.el10"/>
Oct 07 22:07:35 compute-0 nova_compute[192716]:       <nova:name>tempest-TestExecuteNodeResourceConsolidationStrategy-server-2006630721</nova:name>
Oct 07 22:07:35 compute-0 nova_compute[192716]:       <nova:creationTime>2025-10-07 22:07:34</nova:creationTime>
Oct 07 22:07:35 compute-0 nova_compute[192716]:       <nova:flavor name="m1.nano" id="e6ccb6f6-2ec4-4305-9fdd-4d931e83ed21">
Oct 07 22:07:35 compute-0 nova_compute[192716]:         <nova:memory>128</nova:memory>
Oct 07 22:07:35 compute-0 nova_compute[192716]:         <nova:disk>1</nova:disk>
Oct 07 22:07:35 compute-0 nova_compute[192716]:         <nova:swap>0</nova:swap>
Oct 07 22:07:35 compute-0 nova_compute[192716]:         <nova:ephemeral>0</nova:ephemeral>
Oct 07 22:07:35 compute-0 nova_compute[192716]:         <nova:vcpus>1</nova:vcpus>
Oct 07 22:07:35 compute-0 nova_compute[192716]:         <nova:extraSpecs>
Oct 07 22:07:35 compute-0 nova_compute[192716]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 07 22:07:35 compute-0 nova_compute[192716]:         </nova:extraSpecs>
Oct 07 22:07:35 compute-0 nova_compute[192716]:       </nova:flavor>
Oct 07 22:07:35 compute-0 nova_compute[192716]:       <nova:image uuid="c40cab67-7e52-4762-b275-de0efa24bdf4">
Oct 07 22:07:35 compute-0 nova_compute[192716]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 07 22:07:35 compute-0 nova_compute[192716]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 07 22:07:35 compute-0 nova_compute[192716]:         <nova:minDisk>1</nova:minDisk>
Oct 07 22:07:35 compute-0 nova_compute[192716]:         <nova:minRam>0</nova:minRam>
Oct 07 22:07:35 compute-0 nova_compute[192716]:         <nova:properties>
Oct 07 22:07:35 compute-0 nova_compute[192716]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 07 22:07:35 compute-0 nova_compute[192716]:         </nova:properties>
Oct 07 22:07:35 compute-0 nova_compute[192716]:       </nova:image>
Oct 07 22:07:35 compute-0 nova_compute[192716]:       <nova:owner>
Oct 07 22:07:35 compute-0 nova_compute[192716]:         <nova:user uuid="641fbca23ed24b428028d3bc567991bf">tempest-TestExecuteNodeResourceConsolidationStrategy-2065444639-project-admin</nova:user>
Oct 07 22:07:35 compute-0 nova_compute[192716]:         <nova:project uuid="faa7f94deef04b67982eaf47a775c225">tempest-TestExecuteNodeResourceConsolidationStrategy-2065444639</nova:project>
Oct 07 22:07:35 compute-0 nova_compute[192716]:       </nova:owner>
Oct 07 22:07:35 compute-0 nova_compute[192716]:       <nova:root type="image" uuid="c40cab67-7e52-4762-b275-de0efa24bdf4"/>
Oct 07 22:07:35 compute-0 nova_compute[192716]:       <nova:ports>
Oct 07 22:07:35 compute-0 nova_compute[192716]:         <nova:port uuid="dae0547a-45c6-4b1f-bd90-f18af339dcb3">
Oct 07 22:07:35 compute-0 nova_compute[192716]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Oct 07 22:07:35 compute-0 nova_compute[192716]:         </nova:port>
Oct 07 22:07:35 compute-0 nova_compute[192716]:       </nova:ports>
Oct 07 22:07:35 compute-0 nova_compute[192716]:     </nova:instance>
Oct 07 22:07:35 compute-0 nova_compute[192716]:   </metadata>
Oct 07 22:07:35 compute-0 nova_compute[192716]:   <sysinfo type="smbios">
Oct 07 22:07:35 compute-0 nova_compute[192716]:     <system>
Oct 07 22:07:35 compute-0 nova_compute[192716]:       <entry name="manufacturer">RDO</entry>
Oct 07 22:07:35 compute-0 nova_compute[192716]:       <entry name="product">OpenStack Compute</entry>
Oct 07 22:07:35 compute-0 nova_compute[192716]:       <entry name="version">32.1.0-0.20251007122402.7278e66.el10</entry>
Oct 07 22:07:35 compute-0 nova_compute[192716]:       <entry name="serial">3158f0ab-25fe-4a1a-8c95-8d9b702e260b</entry>
Oct 07 22:07:35 compute-0 nova_compute[192716]:       <entry name="uuid">3158f0ab-25fe-4a1a-8c95-8d9b702e260b</entry>
Oct 07 22:07:35 compute-0 nova_compute[192716]:       <entry name="family">Virtual Machine</entry>
Oct 07 22:07:35 compute-0 nova_compute[192716]:     </system>
Oct 07 22:07:35 compute-0 nova_compute[192716]:   </sysinfo>
Oct 07 22:07:35 compute-0 nova_compute[192716]:   <os>
Oct 07 22:07:35 compute-0 nova_compute[192716]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 07 22:07:35 compute-0 nova_compute[192716]:     <boot dev="hd"/>
Oct 07 22:07:35 compute-0 nova_compute[192716]:     <smbios mode="sysinfo"/>
Oct 07 22:07:35 compute-0 nova_compute[192716]:   </os>
Oct 07 22:07:35 compute-0 nova_compute[192716]:   <features>
Oct 07 22:07:35 compute-0 nova_compute[192716]:     <acpi/>
Oct 07 22:07:35 compute-0 nova_compute[192716]:     <apic/>
Oct 07 22:07:35 compute-0 nova_compute[192716]:     <vmcoreinfo/>
Oct 07 22:07:35 compute-0 nova_compute[192716]:   </features>
Oct 07 22:07:35 compute-0 nova_compute[192716]:   <clock offset="utc">
Oct 07 22:07:35 compute-0 nova_compute[192716]:     <timer name="pit" tickpolicy="delay"/>
Oct 07 22:07:35 compute-0 nova_compute[192716]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 07 22:07:35 compute-0 nova_compute[192716]:     <timer name="hpet" present="no"/>
Oct 07 22:07:35 compute-0 nova_compute[192716]:   </clock>
Oct 07 22:07:35 compute-0 nova_compute[192716]:   <cpu mode="host-model" match="exact">
Oct 07 22:07:35 compute-0 nova_compute[192716]:     <topology sockets="1" cores="1" threads="1"/>
Oct 07 22:07:35 compute-0 nova_compute[192716]:   </cpu>
Oct 07 22:07:35 compute-0 nova_compute[192716]:   <devices>
Oct 07 22:07:35 compute-0 nova_compute[192716]:     <disk type="file" device="disk">
Oct 07 22:07:35 compute-0 nova_compute[192716]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 07 22:07:35 compute-0 nova_compute[192716]:       <source file="/var/lib/nova/instances/3158f0ab-25fe-4a1a-8c95-8d9b702e260b/disk"/>
Oct 07 22:07:35 compute-0 nova_compute[192716]:       <target dev="vda" bus="virtio"/>
Oct 07 22:07:35 compute-0 nova_compute[192716]:     </disk>
Oct 07 22:07:35 compute-0 nova_compute[192716]:     <disk type="file" device="cdrom">
Oct 07 22:07:35 compute-0 nova_compute[192716]:       <driver name="qemu" type="raw" cache="none"/>
Oct 07 22:07:35 compute-0 nova_compute[192716]:       <source file="/var/lib/nova/instances/3158f0ab-25fe-4a1a-8c95-8d9b702e260b/disk.config"/>
Oct 07 22:07:35 compute-0 nova_compute[192716]:       <target dev="sda" bus="sata"/>
Oct 07 22:07:35 compute-0 nova_compute[192716]:     </disk>
Oct 07 22:07:35 compute-0 nova_compute[192716]:     <interface type="ethernet">
Oct 07 22:07:35 compute-0 nova_compute[192716]:       <mac address="fa:16:3e:70:80:f5"/>
Oct 07 22:07:35 compute-0 nova_compute[192716]:       <model type="virtio"/>
Oct 07 22:07:35 compute-0 nova_compute[192716]:       <driver name="vhost" rx_queue_size="512"/>
Oct 07 22:07:35 compute-0 nova_compute[192716]:       <mtu size="1442"/>
Oct 07 22:07:35 compute-0 nova_compute[192716]:       <target dev="tapdae0547a-45"/>
Oct 07 22:07:35 compute-0 nova_compute[192716]:     </interface>
Oct 07 22:07:35 compute-0 nova_compute[192716]:     <serial type="pty">
Oct 07 22:07:35 compute-0 nova_compute[192716]:       <log file="/var/lib/nova/instances/3158f0ab-25fe-4a1a-8c95-8d9b702e260b/console.log" append="off"/>
Oct 07 22:07:35 compute-0 nova_compute[192716]:     </serial>
Oct 07 22:07:35 compute-0 nova_compute[192716]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 07 22:07:35 compute-0 nova_compute[192716]:     <video>
Oct 07 22:07:35 compute-0 nova_compute[192716]:       <model type="virtio"/>
Oct 07 22:07:35 compute-0 nova_compute[192716]:     </video>
Oct 07 22:07:35 compute-0 nova_compute[192716]:     <input type="tablet" bus="usb"/>
Oct 07 22:07:35 compute-0 nova_compute[192716]:     <rng model="virtio">
Oct 07 22:07:35 compute-0 nova_compute[192716]:       <backend model="random">/dev/urandom</backend>
Oct 07 22:07:35 compute-0 nova_compute[192716]:     </rng>
Oct 07 22:07:35 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root"/>
Oct 07 22:07:35 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:07:35 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:07:35 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:07:35 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:07:35 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:07:35 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:07:35 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:07:35 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:07:35 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:07:35 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:07:35 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:07:35 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:07:35 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:07:35 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:07:35 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:07:35 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:07:35 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:07:35 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:07:35 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:07:35 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:07:35 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:07:35 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:07:35 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:07:35 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:07:35 compute-0 nova_compute[192716]:     <controller type="usb" index="0"/>
Oct 07 22:07:35 compute-0 nova_compute[192716]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 07 22:07:35 compute-0 nova_compute[192716]:       <stats period="10"/>
Oct 07 22:07:35 compute-0 nova_compute[192716]:     </memballoon>
Oct 07 22:07:35 compute-0 nova_compute[192716]:   </devices>
Oct 07 22:07:35 compute-0 nova_compute[192716]: </domain>
Oct 07 22:07:35 compute-0 nova_compute[192716]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Oct 07 22:07:35 compute-0 nova_compute[192716]: 2025-10-07 22:07:35.125 2 DEBUG nova.compute.manager [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Preparing to wait for external event network-vif-plugged-dae0547a-45c6-4b1f-bd90-f18af339dcb3 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Oct 07 22:07:35 compute-0 nova_compute[192716]: 2025-10-07 22:07:35.126 2 DEBUG oslo_concurrency.lockutils [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Acquiring lock "3158f0ab-25fe-4a1a-8c95-8d9b702e260b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:07:35 compute-0 nova_compute[192716]: 2025-10-07 22:07:35.126 2 DEBUG oslo_concurrency.lockutils [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Lock "3158f0ab-25fe-4a1a-8c95-8d9b702e260b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:07:35 compute-0 nova_compute[192716]: 2025-10-07 22:07:35.126 2 DEBUG oslo_concurrency.lockutils [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Lock "3158f0ab-25fe-4a1a-8c95-8d9b702e260b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:07:35 compute-0 nova_compute[192716]: 2025-10-07 22:07:35.128 2 DEBUG nova.virt.libvirt.vif [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-07T22:07:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-2006630721',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-200',id=19,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='faa7f94deef04b67982eaf47a775c225',ramdisk_id='',reservation_id='r-1zu2dw75',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-2065444639',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-2065444639-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T22:07:24Z,user_data=None,user_id='641fbca23ed24b428028d3bc567991bf',uuid=3158f0ab-25fe-4a1a-8c95-8d9b702e260b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dae0547a-45c6-4b1f-bd90-f18af339dcb3", "address": "fa:16:3e:70:80:f5", "network": {"id": "7f17307e-ac72-4a6f-8a05-ba2eca705379", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-301862773-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ddfd0140eaa4d5e8d43efda963767d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdae0547a-45", "ovs_interfaceid": "dae0547a-45c6-4b1f-bd90-f18af339dcb3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 07 22:07:35 compute-0 nova_compute[192716]: 2025-10-07 22:07:35.128 2 DEBUG nova.network.os_vif_util [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Converting VIF {"id": "dae0547a-45c6-4b1f-bd90-f18af339dcb3", "address": "fa:16:3e:70:80:f5", "network": {"id": "7f17307e-ac72-4a6f-8a05-ba2eca705379", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-301862773-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ddfd0140eaa4d5e8d43efda963767d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdae0547a-45", "ovs_interfaceid": "dae0547a-45c6-4b1f-bd90-f18af339dcb3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 22:07:35 compute-0 nova_compute[192716]: 2025-10-07 22:07:35.129 2 DEBUG nova.network.os_vif_util [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:70:80:f5,bridge_name='br-int',has_traffic_filtering=True,id=dae0547a-45c6-4b1f-bd90-f18af339dcb3,network=Network(7f17307e-ac72-4a6f-8a05-ba2eca705379),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdae0547a-45') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 22:07:35 compute-0 nova_compute[192716]: 2025-10-07 22:07:35.130 2 DEBUG os_vif [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:70:80:f5,bridge_name='br-int',has_traffic_filtering=True,id=dae0547a-45c6-4b1f-bd90-f18af339dcb3,network=Network(7f17307e-ac72-4a6f-8a05-ba2eca705379),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdae0547a-45') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 07 22:07:35 compute-0 nova_compute[192716]: 2025-10-07 22:07:35.131 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:07:35 compute-0 nova_compute[192716]: 2025-10-07 22:07:35.132 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:07:35 compute-0 nova_compute[192716]: 2025-10-07 22:07:35.132 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:07:35 compute-0 nova_compute[192716]: 2025-10-07 22:07:35.133 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:07:35 compute-0 nova_compute[192716]: 2025-10-07 22:07:35.134 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'c91538f4-32c4-5ab4-bdd6-18f658b37d11', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:07:35 compute-0 nova_compute[192716]: 2025-10-07 22:07:35.136 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:07:35 compute-0 nova_compute[192716]: 2025-10-07 22:07:35.138 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 07 22:07:35 compute-0 nova_compute[192716]: 2025-10-07 22:07:35.145 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:07:35 compute-0 nova_compute[192716]: 2025-10-07 22:07:35.145 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdae0547a-45, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:07:35 compute-0 nova_compute[192716]: 2025-10-07 22:07:35.146 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapdae0547a-45, col_values=(('qos', UUID('b8fa3d8a-6bbf-4124-a430-a081988c157d')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:07:35 compute-0 nova_compute[192716]: 2025-10-07 22:07:35.146 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapdae0547a-45, col_values=(('external_ids', {'iface-id': 'dae0547a-45c6-4b1f-bd90-f18af339dcb3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:70:80:f5', 'vm-uuid': '3158f0ab-25fe-4a1a-8c95-8d9b702e260b'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:07:35 compute-0 nova_compute[192716]: 2025-10-07 22:07:35.149 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:07:35 compute-0 NetworkManager[51722]: <info>  [1759874855.1510] manager: (tapdae0547a-45): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/63)
Oct 07 22:07:35 compute-0 nova_compute[192716]: 2025-10-07 22:07:35.152 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 07 22:07:35 compute-0 nova_compute[192716]: 2025-10-07 22:07:35.159 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:07:35 compute-0 nova_compute[192716]: 2025-10-07 22:07:35.161 2 INFO os_vif [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:70:80:f5,bridge_name='br-int',has_traffic_filtering=True,id=dae0547a-45c6-4b1f-bd90-f18af339dcb3,network=Network(7f17307e-ac72-4a6f-8a05-ba2eca705379),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdae0547a-45')
Oct 07 22:07:35 compute-0 podman[222422]: 2025-10-07 22:07:35.27194374 +0000 UTC m=+0.073045316 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct 07 22:07:35 compute-0 podman[222424]: 2025-10-07 22:07:35.312994547 +0000 UTC m=+0.106436833 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251007)
Oct 07 22:07:35 compute-0 nova_compute[192716]: 2025-10-07 22:07:35.986 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:07:36 compute-0 nova_compute[192716]: 2025-10-07 22:07:36.740 2 DEBUG nova.virt.libvirt.driver [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 07 22:07:36 compute-0 nova_compute[192716]: 2025-10-07 22:07:36.741 2 DEBUG nova.virt.libvirt.driver [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 07 22:07:36 compute-0 nova_compute[192716]: 2025-10-07 22:07:36.741 2 DEBUG nova.virt.libvirt.driver [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] No VIF found with MAC fa:16:3e:70:80:f5, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Oct 07 22:07:36 compute-0 nova_compute[192716]: 2025-10-07 22:07:36.742 2 INFO nova.virt.libvirt.driver [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Using config drive
Oct 07 22:07:37 compute-0 nova_compute[192716]: 2025-10-07 22:07:37.256 2 WARNING neutronclient.v2_0.client [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:07:37 compute-0 podman[222464]: 2025-10-07 22:07:37.821046094 +0000 UTC m=+0.065340475 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 07 22:07:37 compute-0 nova_compute[192716]: 2025-10-07 22:07:37.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:07:38 compute-0 nova_compute[192716]: 2025-10-07 22:07:38.020 2 INFO nova.virt.libvirt.driver [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Creating config drive at /var/lib/nova/instances/3158f0ab-25fe-4a1a-8c95-8d9b702e260b/disk.config
Oct 07 22:07:38 compute-0 nova_compute[192716]: 2025-10-07 22:07:38.031 2 DEBUG oslo_concurrency.processutils [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3158f0ab-25fe-4a1a-8c95-8d9b702e260b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251007122402.7278e66.el10 -quiet -J -r -V config-2 /tmp/tmpi8qcz2rc execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:07:38 compute-0 nova_compute[192716]: 2025-10-07 22:07:38.167 2 DEBUG oslo_concurrency.processutils [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3158f0ab-25fe-4a1a-8c95-8d9b702e260b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251007122402.7278e66.el10 -quiet -J -r -V config-2 /tmp/tmpi8qcz2rc" returned: 0 in 0.136s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:07:38 compute-0 kernel: tapdae0547a-45: entered promiscuous mode
Oct 07 22:07:38 compute-0 NetworkManager[51722]: <info>  [1759874858.2706] manager: (tapdae0547a-45): new Tun device (/org/freedesktop/NetworkManager/Devices/64)
Oct 07 22:07:38 compute-0 nova_compute[192716]: 2025-10-07 22:07:38.271 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:07:38 compute-0 ovn_controller[94904]: 2025-10-07T22:07:38Z|00172|binding|INFO|Claiming lport dae0547a-45c6-4b1f-bd90-f18af339dcb3 for this chassis.
Oct 07 22:07:38 compute-0 ovn_controller[94904]: 2025-10-07T22:07:38Z|00173|binding|INFO|dae0547a-45c6-4b1f-bd90-f18af339dcb3: Claiming fa:16:3e:70:80:f5 10.100.0.7
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:07:38.289 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:70:80:f5 10.100.0.7'], port_security=['fa:16:3e:70:80:f5 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '3158f0ab-25fe-4a1a-8c95-8d9b702e260b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7f17307e-ac72-4a6f-8a05-ba2eca705379', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'faa7f94deef04b67982eaf47a775c225', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6ea0c626-bce8-4d7e-8c0d-f51033bcdaff', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1675f3b1-9c7c-4176-8c45-0239d0b298ba, chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], logical_port=dae0547a-45c6-4b1f-bd90-f18af339dcb3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:07:38.290 103791 INFO neutron.agent.ovn.metadata.agent [-] Port dae0547a-45c6-4b1f-bd90-f18af339dcb3 in datapath 7f17307e-ac72-4a6f-8a05-ba2eca705379 bound to our chassis
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:07:38.292 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7f17307e-ac72-4a6f-8a05-ba2eca705379
Oct 07 22:07:38 compute-0 ovn_controller[94904]: 2025-10-07T22:07:38Z|00174|binding|INFO|Setting lport dae0547a-45c6-4b1f-bd90-f18af339dcb3 ovn-installed in OVS
Oct 07 22:07:38 compute-0 ovn_controller[94904]: 2025-10-07T22:07:38Z|00175|binding|INFO|Setting lport dae0547a-45c6-4b1f-bd90-f18af339dcb3 up in Southbound
Oct 07 22:07:38 compute-0 nova_compute[192716]: 2025-10-07 22:07:38.294 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:07:38 compute-0 nova_compute[192716]: 2025-10-07 22:07:38.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:07:38 compute-0 systemd-udevd[222504]: Network interface NamePolicy= disabled on kernel command line.
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:07:38.307 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[108a7929-b25f-4067-8250-e7bbb0965de3]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:07:38.311 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7f17307e-a1 in ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:07:38.313 214116 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7f17307e-a0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:07:38.313 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[459b4cf8-825d-467d-80f7-d0789d08620c]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:07:38.314 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[95a7e0d5-6b63-49ce-a29a-8f7c3f64b439]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:07:38 compute-0 NetworkManager[51722]: <info>  [1759874858.3233] device (tapdae0547a-45): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 07 22:07:38 compute-0 NetworkManager[51722]: <info>  [1759874858.3241] device (tapdae0547a-45): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:07:38.329 103905 DEBUG oslo.privsep.daemon [-] privsep: reply[ebf490fc-8a59-4ab6-8b44-a614032a9819]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:07:38 compute-0 systemd-machined[152719]: New machine qemu-14-instance-00000013.
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:07:38.336 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[6a0581cd-c2a1-4d95-b800-9ff4975f3740]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:07:38 compute-0 systemd[1]: Started Virtual Machine qemu-14-instance-00000013.
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:07:38.373 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[f27e5f24-7441-496c-9dd0-228f2fe4aa58]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:07:38.377 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[0ed6030e-9267-4c90-93ac-b24b3fb01d03]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:07:38 compute-0 systemd-udevd[222510]: Network interface NamePolicy= disabled on kernel command line.
Oct 07 22:07:38 compute-0 NetworkManager[51722]: <info>  [1759874858.3795] manager: (tap7f17307e-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/65)
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:07:38.412 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[c114d7dc-5cf0-4b00-b4f1-e0a691361091]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:07:38.414 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[e358ce15-c7a1-4cd5-8fc4-163656004ee6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:07:38 compute-0 NetworkManager[51722]: <info>  [1759874858.4394] device (tap7f17307e-a0): carrier: link connected
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:07:38.443 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[d703fc58-f864-429e-9078-53b833bd6a0b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:07:38.459 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[5e960641-ee09-4cd5-84cc-2ddfb9192940]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7f17307e-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:d3:d5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 46], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 459202, 'reachable_time': 23745, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222539, 'error': None, 'target': 'ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:07:38.477 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[22adaf71-4087-4118-9f04-b345bb41c79f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea7:d3d5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 459202, 'tstamp': 459202}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222540, 'error': None, 'target': 'ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:07:38.491 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[83779270-bd1d-435b-9c72-892d97d12a33]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7f17307e-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:d3:d5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 46], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 459202, 'reachable_time': 23745, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222541, 'error': None, 'target': 'ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:07:38.530 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[a1c133d2-a001-4f0d-b3a1-365ab8d19748]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:07:38 compute-0 nova_compute[192716]: 2025-10-07 22:07:38.547 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:07:38 compute-0 nova_compute[192716]: 2025-10-07 22:07:38.548 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:07:38 compute-0 nova_compute[192716]: 2025-10-07 22:07:38.548 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:07:38 compute-0 nova_compute[192716]: 2025-10-07 22:07:38.548 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:07:38.604 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[9fb6dead-2a19-44c9-a00a-eb1c42a9c45a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:07:38.605 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7f17307e-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:07:38.606 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:07:38.606 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7f17307e-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:07:38 compute-0 NetworkManager[51722]: <info>  [1759874858.6265] manager: (tap7f17307e-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/66)
Oct 07 22:07:38 compute-0 kernel: tap7f17307e-a0: entered promiscuous mode
Oct 07 22:07:38 compute-0 nova_compute[192716]: 2025-10-07 22:07:38.626 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:07:38.632 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7f17307e-a0, col_values=(('external_ids', {'iface-id': '6865dbad-0588-4cfd-9a22-08a49ea1d5a5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:07:38 compute-0 nova_compute[192716]: 2025-10-07 22:07:38.633 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:07:38 compute-0 ovn_controller[94904]: 2025-10-07T22:07:38Z|00176|binding|INFO|Releasing lport 6865dbad-0588-4cfd-9a22-08a49ea1d5a5 from this chassis (sb_readonly=0)
Oct 07 22:07:38 compute-0 nova_compute[192716]: 2025-10-07 22:07:38.658 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:07:38.660 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[30cfce17-8558-4d3c-aab2-60e85d09d1b7]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:07:38.661 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7f17307e-ac72-4a6f-8a05-ba2eca705379.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7f17307e-ac72-4a6f-8a05-ba2eca705379.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:07:38.662 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7f17307e-ac72-4a6f-8a05-ba2eca705379.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7f17307e-ac72-4a6f-8a05-ba2eca705379.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:07:38.662 103791 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 7f17307e-ac72-4a6f-8a05-ba2eca705379 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:07:38.662 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7f17307e-ac72-4a6f-8a05-ba2eca705379.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7f17307e-ac72-4a6f-8a05-ba2eca705379.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:07:38.663 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[239d0f49-c555-4a8f-94cd-b9c0d8171c95]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:07:38.663 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7f17307e-ac72-4a6f-8a05-ba2eca705379.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7f17307e-ac72-4a6f-8a05-ba2eca705379.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:07:38.664 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[2f2fcd29-2801-496b-a1dd-da3683001a71]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:07:38.664 103791 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]: global
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]:     log         /dev/log local0 debug
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]:     log-tag     haproxy-metadata-proxy-7f17307e-ac72-4a6f-8a05-ba2eca705379
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]:     user        root
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]:     group       root
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]:     maxconn     1024
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]:     pidfile     /var/lib/neutron/external/pids/7f17307e-ac72-4a6f-8a05-ba2eca705379.pid.haproxy
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]:     daemon
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]: 
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]: defaults
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]:     log global
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]:     mode http
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]:     option httplog
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]:     option dontlognull
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]:     option http-server-close
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]:     option forwardfor
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]:     retries                 3
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]:     timeout http-request    30s
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]:     timeout connect         30s
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]:     timeout client          32s
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]:     timeout server          32s
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]:     timeout http-keep-alive 30s
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]: 
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]: listen listener
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]:     bind 169.254.169.254:80
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]:     
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]:     server metadata /var/lib/neutron/metadata_proxy
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]: 
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]:     http-request add-header X-OVN-Network-ID 7f17307e-ac72-4a6f-8a05-ba2eca705379
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:07:38.667 103791 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379', 'env', 'PROCESS_TAG=haproxy-7f17307e-ac72-4a6f-8a05-ba2eca705379', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7f17307e-ac72-4a6f-8a05-ba2eca705379.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Oct 07 22:07:38 compute-0 nova_compute[192716]: 2025-10-07 22:07:38.880 2 DEBUG nova.compute.manager [req-9dad49f0-1fc6-4d7d-bb99-a34926f29c9a req-f9c7b3f3-cbf9-4ed2-922a-da18b23dab1b 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Received event network-vif-plugged-dae0547a-45c6-4b1f-bd90-f18af339dcb3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:07:38 compute-0 nova_compute[192716]: 2025-10-07 22:07:38.880 2 DEBUG oslo_concurrency.lockutils [req-9dad49f0-1fc6-4d7d-bb99-a34926f29c9a req-f9c7b3f3-cbf9-4ed2-922a-da18b23dab1b 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "3158f0ab-25fe-4a1a-8c95-8d9b702e260b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:07:38 compute-0 nova_compute[192716]: 2025-10-07 22:07:38.880 2 DEBUG oslo_concurrency.lockutils [req-9dad49f0-1fc6-4d7d-bb99-a34926f29c9a req-f9c7b3f3-cbf9-4ed2-922a-da18b23dab1b 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "3158f0ab-25fe-4a1a-8c95-8d9b702e260b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:07:38 compute-0 nova_compute[192716]: 2025-10-07 22:07:38.881 2 DEBUG oslo_concurrency.lockutils [req-9dad49f0-1fc6-4d7d-bb99-a34926f29c9a req-f9c7b3f3-cbf9-4ed2-922a-da18b23dab1b 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "3158f0ab-25fe-4a1a-8c95-8d9b702e260b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:07:38 compute-0 nova_compute[192716]: 2025-10-07 22:07:38.881 2 DEBUG nova.compute.manager [req-9dad49f0-1fc6-4d7d-bb99-a34926f29c9a req-f9c7b3f3-cbf9-4ed2-922a-da18b23dab1b 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Processing event network-vif-plugged-dae0547a-45c6-4b1f-bd90-f18af339dcb3 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Oct 07 22:07:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:07:38.975 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:e1:cc', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '62:84:ba:a4:1b:31'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:07:38 compute-0 nova_compute[192716]: 2025-10-07 22:07:38.976 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:07:39 compute-0 nova_compute[192716]: 2025-10-07 22:07:39.107 2 DEBUG nova.compute.manager [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Oct 07 22:07:39 compute-0 nova_compute[192716]: 2025-10-07 22:07:39.111 2 DEBUG nova.virt.libvirt.driver [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Oct 07 22:07:39 compute-0 nova_compute[192716]: 2025-10-07 22:07:39.115 2 INFO nova.virt.libvirt.driver [-] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Instance spawned successfully.
Oct 07 22:07:39 compute-0 nova_compute[192716]: 2025-10-07 22:07:39.116 2 DEBUG nova.virt.libvirt.driver [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Oct 07 22:07:39 compute-0 podman[222579]: 2025-10-07 22:07:39.205390804 +0000 UTC m=+0.120634960 container create b6d8ac6b221794506605a1f5c2c4716f523c544d00f809b0d165964803206769 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379, org.label-schema.license=GPLv2, org.label-schema.build-date=20251007, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 07 22:07:39 compute-0 podman[222579]: 2025-10-07 22:07:39.118228685 +0000 UTC m=+0.033472851 image pull 24d4277b41bbd1d97b6f360ea068040fe96182680512bacad34d1f578f4798a9 38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Oct 07 22:07:39 compute-0 systemd[1]: Started libpod-conmon-b6d8ac6b221794506605a1f5c2c4716f523c544d00f809b0d165964803206769.scope.
Oct 07 22:07:39 compute-0 systemd[1]: Started libcrun container.
Oct 07 22:07:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0f27375bb6dd6ded02ee55156c1fe977c82094f9010d0447edca3b8c7567db1f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 07 22:07:39 compute-0 podman[222579]: 2025-10-07 22:07:39.312270029 +0000 UTC m=+0.227514215 container init b6d8ac6b221794506605a1f5c2c4716f523c544d00f809b0d165964803206769 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct 07 22:07:39 compute-0 podman[222579]: 2025-10-07 22:07:39.323654115 +0000 UTC m=+0.238898271 container start b6d8ac6b221794506605a1f5c2c4716f523c544d00f809b0d165964803206769 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379, org.label-schema.license=GPLv2, org.label-schema.build-date=20251007, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 07 22:07:39 compute-0 neutron-haproxy-ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379[222592]: [NOTICE]   (222596) : New worker (222601) forked
Oct 07 22:07:39 compute-0 neutron-haproxy-ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379[222592]: [NOTICE]   (222596) : Loading success.
Oct 07 22:07:39 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:07:39.401 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 07 22:07:39 compute-0 nova_compute[192716]: 2025-10-07 22:07:39.647 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3158f0ab-25fe-4a1a-8c95-8d9b702e260b/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:07:39 compute-0 nova_compute[192716]: 2025-10-07 22:07:39.668 2 DEBUG nova.virt.libvirt.driver [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 22:07:39 compute-0 nova_compute[192716]: 2025-10-07 22:07:39.670 2 DEBUG nova.virt.libvirt.driver [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 22:07:39 compute-0 nova_compute[192716]: 2025-10-07 22:07:39.673 2 DEBUG nova.virt.libvirt.driver [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 22:07:39 compute-0 nova_compute[192716]: 2025-10-07 22:07:39.674 2 DEBUG nova.virt.libvirt.driver [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 22:07:39 compute-0 nova_compute[192716]: 2025-10-07 22:07:39.674 2 DEBUG nova.virt.libvirt.driver [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 22:07:39 compute-0 nova_compute[192716]: 2025-10-07 22:07:39.675 2 DEBUG nova.virt.libvirt.driver [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 22:07:39 compute-0 nova_compute[192716]: 2025-10-07 22:07:39.750 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3158f0ab-25fe-4a1a-8c95-8d9b702e260b/disk --force-share --output=json" returned: 0 in 0.103s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:07:39 compute-0 nova_compute[192716]: 2025-10-07 22:07:39.751 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3158f0ab-25fe-4a1a-8c95-8d9b702e260b/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:07:39 compute-0 nova_compute[192716]: 2025-10-07 22:07:39.847 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3158f0ab-25fe-4a1a-8c95-8d9b702e260b/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:07:40 compute-0 nova_compute[192716]: 2025-10-07 22:07:40.016 2 WARNING nova.virt.libvirt.driver [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 22:07:40 compute-0 nova_compute[192716]: 2025-10-07 22:07:40.017 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:07:40 compute-0 nova_compute[192716]: 2025-10-07 22:07:40.042 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:07:40 compute-0 nova_compute[192716]: 2025-10-07 22:07:40.049 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.032s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:07:40 compute-0 nova_compute[192716]: 2025-10-07 22:07:40.049 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5794MB free_disk=73.30289840698242GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 07 22:07:40 compute-0 nova_compute[192716]: 2025-10-07 22:07:40.050 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:07:40 compute-0 nova_compute[192716]: 2025-10-07 22:07:40.051 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:07:40 compute-0 nova_compute[192716]: 2025-10-07 22:07:40.150 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:07:40 compute-0 nova_compute[192716]: 2025-10-07 22:07:40.190 2 INFO nova.compute.manager [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Took 14.36 seconds to spawn the instance on the hypervisor.
Oct 07 22:07:40 compute-0 nova_compute[192716]: 2025-10-07 22:07:40.191 2 DEBUG nova.compute.manager [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 07 22:07:40 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:07:40.403 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=dca786dc-b408-4181-8e47-0e14c60f13da, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:07:40 compute-0 nova_compute[192716]: 2025-10-07 22:07:40.728 2 INFO nova.compute.manager [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Took 19.58 seconds to build instance.
Oct 07 22:07:40 compute-0 nova_compute[192716]: 2025-10-07 22:07:40.955 2 DEBUG nova.compute.manager [req-8f8bbdc6-9891-49c1-a447-a5969a9a5a0f req-af888c7a-c4b1-48d5-8279-ef0974745e73 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Received event network-vif-plugged-dae0547a-45c6-4b1f-bd90-f18af339dcb3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:07:40 compute-0 nova_compute[192716]: 2025-10-07 22:07:40.956 2 DEBUG oslo_concurrency.lockutils [req-8f8bbdc6-9891-49c1-a447-a5969a9a5a0f req-af888c7a-c4b1-48d5-8279-ef0974745e73 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "3158f0ab-25fe-4a1a-8c95-8d9b702e260b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:07:40 compute-0 nova_compute[192716]: 2025-10-07 22:07:40.956 2 DEBUG oslo_concurrency.lockutils [req-8f8bbdc6-9891-49c1-a447-a5969a9a5a0f req-af888c7a-c4b1-48d5-8279-ef0974745e73 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "3158f0ab-25fe-4a1a-8c95-8d9b702e260b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:07:40 compute-0 nova_compute[192716]: 2025-10-07 22:07:40.957 2 DEBUG oslo_concurrency.lockutils [req-8f8bbdc6-9891-49c1-a447-a5969a9a5a0f req-af888c7a-c4b1-48d5-8279-ef0974745e73 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "3158f0ab-25fe-4a1a-8c95-8d9b702e260b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:07:40 compute-0 nova_compute[192716]: 2025-10-07 22:07:40.957 2 DEBUG nova.compute.manager [req-8f8bbdc6-9891-49c1-a447-a5969a9a5a0f req-af888c7a-c4b1-48d5-8279-ef0974745e73 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] No waiting events found dispatching network-vif-plugged-dae0547a-45c6-4b1f-bd90-f18af339dcb3 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 22:07:40 compute-0 nova_compute[192716]: 2025-10-07 22:07:40.957 2 WARNING nova.compute.manager [req-8f8bbdc6-9891-49c1-a447-a5969a9a5a0f req-af888c7a-c4b1-48d5-8279-ef0974745e73 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Received unexpected event network-vif-plugged-dae0547a-45c6-4b1f-bd90-f18af339dcb3 for instance with vm_state active and task_state None.
Oct 07 22:07:41 compute-0 nova_compute[192716]: 2025-10-07 22:07:41.237 2 DEBUG oslo_concurrency.lockutils [None req-93b52f5a-cd6c-4005-b94b-6181e1b7e764 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Lock "3158f0ab-25fe-4a1a-8c95-8d9b702e260b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 21.113s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:07:41 compute-0 nova_compute[192716]: 2025-10-07 22:07:41.276 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Instance 3158f0ab-25fe-4a1a-8c95-8d9b702e260b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 07 22:07:41 compute-0 nova_compute[192716]: 2025-10-07 22:07:41.277 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 07 22:07:41 compute-0 nova_compute[192716]: 2025-10-07 22:07:41.277 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:07:40 up  1:16,  0 user,  load average: 0.09, 0.16, 0.25\n', 'num_instances': '1', 'num_vm_building': '1', 'num_task_spawning': '1', 'num_os_type_None': '1', 'num_proj_faa7f94deef04b67982eaf47a775c225': '1', 'io_workload': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 07 22:07:41 compute-0 nova_compute[192716]: 2025-10-07 22:07:41.316 2 DEBUG nova.compute.provider_tree [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 22:07:41 compute-0 nova_compute[192716]: 2025-10-07 22:07:41.823 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 22:07:42 compute-0 nova_compute[192716]: 2025-10-07 22:07:42.332 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 07 22:07:42 compute-0 nova_compute[192716]: 2025-10-07 22:07:42.334 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.283s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:07:43 compute-0 nova_compute[192716]: 2025-10-07 22:07:43.335 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:07:43 compute-0 nova_compute[192716]: 2025-10-07 22:07:43.850 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:07:43 compute-0 nova_compute[192716]: 2025-10-07 22:07:43.851 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:07:43 compute-0 nova_compute[192716]: 2025-10-07 22:07:43.852 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:07:43 compute-0 nova_compute[192716]: 2025-10-07 22:07:43.852 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:07:45 compute-0 nova_compute[192716]: 2025-10-07 22:07:45.045 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:07:45 compute-0 nova_compute[192716]: 2025-10-07 22:07:45.151 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:07:45 compute-0 podman[222618]: 2025-10-07 22:07:45.915392044 +0000 UTC m=+0.141260881 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller)
Oct 07 22:07:47 compute-0 podman[222644]: 2025-10-07 22:07:47.854575432 +0000 UTC m=+0.083508815 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251007, tcib_build_tag=watcher_latest)
Oct 07 22:07:50 compute-0 nova_compute[192716]: 2025-10-07 22:07:50.047 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:07:50 compute-0 nova_compute[192716]: 2025-10-07 22:07:50.154 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:07:50 compute-0 ovn_controller[94904]: 2025-10-07T22:07:50Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:70:80:f5 10.100.0.7
Oct 07 22:07:50 compute-0 ovn_controller[94904]: 2025-10-07T22:07:50Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:70:80:f5 10.100.0.7
Oct 07 22:07:51 compute-0 podman[222679]: 2025-10-07 22:07:51.844083443 +0000 UTC m=+0.081979552 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, architecture=x86_64, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=edpm, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct 07 22:07:53 compute-0 nova_compute[192716]: 2025-10-07 22:07:53.070 2 DEBUG nova.virt.libvirt.driver [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 1c6e57e1-3513-4376-b92a-8cbe948d8ec3] Creating tmpfile /var/lib/nova/instances/tmplxv2by01 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Oct 07 22:07:53 compute-0 nova_compute[192716]: 2025-10-07 22:07:53.072 2 WARNING neutronclient.v2_0.client [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:07:53 compute-0 nova_compute[192716]: 2025-10-07 22:07:53.078 2 DEBUG nova.compute.manager [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmplxv2by01',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Oct 07 22:07:55 compute-0 nova_compute[192716]: 2025-10-07 22:07:55.049 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:07:55 compute-0 nova_compute[192716]: 2025-10-07 22:07:55.122 2 WARNING neutronclient.v2_0.client [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:07:55 compute-0 nova_compute[192716]: 2025-10-07 22:07:55.155 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:07:59 compute-0 podman[203153]: time="2025-10-07T22:07:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:07:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:07:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20751 "" "Go-http-client/1.1"
Oct 07 22:07:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:07:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3483 "" "Go-http-client/1.1"
Oct 07 22:08:00 compute-0 nova_compute[192716]: 2025-10-07 22:08:00.053 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:00 compute-0 nova_compute[192716]: 2025-10-07 22:08:00.156 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:00 compute-0 nova_compute[192716]: 2025-10-07 22:08:00.777 2 DEBUG nova.compute.manager [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmplxv2by01',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='1c6e57e1-3513-4376-b92a-8cbe948d8ec3',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Oct 07 22:08:01 compute-0 openstack_network_exporter[205305]: ERROR   22:08:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:08:01 compute-0 openstack_network_exporter[205305]: ERROR   22:08:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:08:01 compute-0 openstack_network_exporter[205305]: ERROR   22:08:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:08:01 compute-0 openstack_network_exporter[205305]: ERROR   22:08:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:08:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:08:01 compute-0 openstack_network_exporter[205305]: ERROR   22:08:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:08:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:08:01 compute-0 nova_compute[192716]: 2025-10-07 22:08:01.793 2 DEBUG oslo_concurrency.lockutils [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "refresh_cache-1c6e57e1-3513-4376-b92a-8cbe948d8ec3" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 07 22:08:01 compute-0 nova_compute[192716]: 2025-10-07 22:08:01.794 2 DEBUG oslo_concurrency.lockutils [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquired lock "refresh_cache-1c6e57e1-3513-4376-b92a-8cbe948d8ec3" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 07 22:08:01 compute-0 nova_compute[192716]: 2025-10-07 22:08:01.794 2 DEBUG nova.network.neutron [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 1c6e57e1-3513-4376-b92a-8cbe948d8ec3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 07 22:08:02 compute-0 nova_compute[192716]: 2025-10-07 22:08:02.303 2 WARNING neutronclient.v2_0.client [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:08:04 compute-0 nova_compute[192716]: 2025-10-07 22:08:04.304 2 WARNING neutronclient.v2_0.client [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:08:04 compute-0 nova_compute[192716]: 2025-10-07 22:08:04.850 2 DEBUG nova.network.neutron [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 1c6e57e1-3513-4376-b92a-8cbe948d8ec3] Updating instance_info_cache with network_info: [{"id": "2dd0148d-97f4-4aff-a43b-54bd6cd5a349", "address": "fa:16:3e:26:cd:64", "network": {"id": "7f17307e-ac72-4a6f-8a05-ba2eca705379", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-301862773-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ddfd0140eaa4d5e8d43efda963767d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2dd0148d-97", "ovs_interfaceid": "2dd0148d-97f4-4aff-a43b-54bd6cd5a349", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:08:05 compute-0 nova_compute[192716]: 2025-10-07 22:08:05.113 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:05 compute-0 nova_compute[192716]: 2025-10-07 22:08:05.159 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:05 compute-0 nova_compute[192716]: 2025-10-07 22:08:05.356 2 DEBUG oslo_concurrency.lockutils [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Releasing lock "refresh_cache-1c6e57e1-3513-4376-b92a-8cbe948d8ec3" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 07 22:08:05 compute-0 nova_compute[192716]: 2025-10-07 22:08:05.373 2 DEBUG nova.virt.libvirt.driver [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 1c6e57e1-3513-4376-b92a-8cbe948d8ec3] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmplxv2by01',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='1c6e57e1-3513-4376-b92a-8cbe948d8ec3',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Oct 07 22:08:05 compute-0 nova_compute[192716]: 2025-10-07 22:08:05.374 2 DEBUG nova.virt.libvirt.driver [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 1c6e57e1-3513-4376-b92a-8cbe948d8ec3] Creating instance directory: /var/lib/nova/instances/1c6e57e1-3513-4376-b92a-8cbe948d8ec3 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Oct 07 22:08:05 compute-0 nova_compute[192716]: 2025-10-07 22:08:05.375 2 DEBUG nova.virt.libvirt.driver [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 1c6e57e1-3513-4376-b92a-8cbe948d8ec3] Creating disk.info with the contents: {'/var/lib/nova/instances/1c6e57e1-3513-4376-b92a-8cbe948d8ec3/disk': 'qcow2', '/var/lib/nova/instances/1c6e57e1-3513-4376-b92a-8cbe948d8ec3/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Oct 07 22:08:05 compute-0 nova_compute[192716]: 2025-10-07 22:08:05.375 2 DEBUG nova.virt.libvirt.driver [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 1c6e57e1-3513-4376-b92a-8cbe948d8ec3] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Oct 07 22:08:05 compute-0 nova_compute[192716]: 2025-10-07 22:08:05.376 2 DEBUG nova.objects.instance [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 1c6e57e1-3513-4376-b92a-8cbe948d8ec3 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 22:08:05 compute-0 podman[222701]: 2025-10-07 22:08:05.867979507 +0000 UTC m=+0.096280022 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251007, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=iscsid, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.license=GPLv2)
Oct 07 22:08:05 compute-0 podman[222702]: 2025-10-07 22:08:05.876035467 +0000 UTC m=+0.094758447 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251007)
Oct 07 22:08:05 compute-0 nova_compute[192716]: 2025-10-07 22:08:05.883 2 DEBUG oslo_utils.imageutils.format_inspector [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:08:05 compute-0 nova_compute[192716]: 2025-10-07 22:08:05.886 2 DEBUG oslo_utils.imageutils.format_inspector [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:08:05 compute-0 nova_compute[192716]: 2025-10-07 22:08:05.887 2 DEBUG oslo_concurrency.processutils [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:08:05 compute-0 nova_compute[192716]: 2025-10-07 22:08:05.955 2 DEBUG oslo_concurrency.processutils [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:08:05 compute-0 nova_compute[192716]: 2025-10-07 22:08:05.957 2 DEBUG oslo_concurrency.lockutils [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:08:05 compute-0 nova_compute[192716]: 2025-10-07 22:08:05.958 2 DEBUG oslo_concurrency.lockutils [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:08:05 compute-0 nova_compute[192716]: 2025-10-07 22:08:05.959 2 DEBUG oslo_utils.imageutils.format_inspector [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:08:05 compute-0 nova_compute[192716]: 2025-10-07 22:08:05.965 2 DEBUG oslo_utils.imageutils.format_inspector [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:08:05 compute-0 nova_compute[192716]: 2025-10-07 22:08:05.966 2 DEBUG oslo_concurrency.processutils [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:08:06 compute-0 nova_compute[192716]: 2025-10-07 22:08:06.046 2 DEBUG oslo_concurrency.processutils [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:08:06 compute-0 nova_compute[192716]: 2025-10-07 22:08:06.047 2 DEBUG oslo_concurrency.processutils [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71,backing_fmt=raw /var/lib/nova/instances/1c6e57e1-3513-4376-b92a-8cbe948d8ec3/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:08:06 compute-0 nova_compute[192716]: 2025-10-07 22:08:06.097 2 DEBUG oslo_concurrency.processutils [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71,backing_fmt=raw /var/lib/nova/instances/1c6e57e1-3513-4376-b92a-8cbe948d8ec3/disk 1073741824" returned: 0 in 0.050s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:08:06 compute-0 nova_compute[192716]: 2025-10-07 22:08:06.099 2 DEBUG oslo_concurrency.lockutils [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.141s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:08:06 compute-0 nova_compute[192716]: 2025-10-07 22:08:06.100 2 DEBUG oslo_concurrency.processutils [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:08:06 compute-0 nova_compute[192716]: 2025-10-07 22:08:06.167 2 DEBUG oslo_concurrency.processutils [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:08:06 compute-0 nova_compute[192716]: 2025-10-07 22:08:06.168 2 DEBUG nova.virt.disk.api [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Checking if we can resize image /var/lib/nova/instances/1c6e57e1-3513-4376-b92a-8cbe948d8ec3/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 07 22:08:06 compute-0 nova_compute[192716]: 2025-10-07 22:08:06.169 2 DEBUG oslo_concurrency.processutils [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c6e57e1-3513-4376-b92a-8cbe948d8ec3/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:08:06 compute-0 nova_compute[192716]: 2025-10-07 22:08:06.237 2 DEBUG oslo_concurrency.processutils [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c6e57e1-3513-4376-b92a-8cbe948d8ec3/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:08:06 compute-0 nova_compute[192716]: 2025-10-07 22:08:06.238 2 DEBUG nova.virt.disk.api [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Cannot resize image /var/lib/nova/instances/1c6e57e1-3513-4376-b92a-8cbe948d8ec3/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 07 22:08:06 compute-0 nova_compute[192716]: 2025-10-07 22:08:06.239 2 DEBUG nova.objects.instance [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lazy-loading 'migration_context' on Instance uuid 1c6e57e1-3513-4376-b92a-8cbe948d8ec3 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 22:08:06 compute-0 nova_compute[192716]: 2025-10-07 22:08:06.747 2 DEBUG nova.objects.base [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Object Instance<1c6e57e1-3513-4376-b92a-8cbe948d8ec3> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Oct 07 22:08:06 compute-0 nova_compute[192716]: 2025-10-07 22:08:06.747 2 DEBUG oslo_concurrency.processutils [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/1c6e57e1-3513-4376-b92a-8cbe948d8ec3/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:08:06 compute-0 nova_compute[192716]: 2025-10-07 22:08:06.790 2 DEBUG oslo_concurrency.processutils [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/1c6e57e1-3513-4376-b92a-8cbe948d8ec3/disk.config 497664" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:08:06 compute-0 nova_compute[192716]: 2025-10-07 22:08:06.791 2 DEBUG nova.virt.libvirt.driver [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 1c6e57e1-3513-4376-b92a-8cbe948d8ec3] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Oct 07 22:08:06 compute-0 nova_compute[192716]: 2025-10-07 22:08:06.794 2 DEBUG nova.virt.libvirt.vif [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-10-07T22:06:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-1648981262',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-164',id=18,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-07T22:07:14Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='faa7f94deef04b67982eaf47a775c225',ramdisk_id='',reservation_id='r-7x3varr4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-2065444639',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-2065444639-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T22:07:14Z,user_data=None,user_id='641fbca23ed24b428028d3bc567991bf',uuid=1c6e57e1-3513-4376-b92a-8cbe948d8ec3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2dd0148d-97f4-4aff-a43b-54bd6cd5a349", "address": "fa:16:3e:26:cd:64", "network": {"id": "7f17307e-ac72-4a6f-8a05-ba2eca705379", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-301862773-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ddfd0140eaa4d5e8d43efda963767d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap2dd0148d-97", "ovs_interfaceid": "2dd0148d-97f4-4aff-a43b-54bd6cd5a349", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 07 22:08:06 compute-0 nova_compute[192716]: 2025-10-07 22:08:06.794 2 DEBUG nova.network.os_vif_util [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Converting VIF {"id": "2dd0148d-97f4-4aff-a43b-54bd6cd5a349", "address": "fa:16:3e:26:cd:64", "network": {"id": "7f17307e-ac72-4a6f-8a05-ba2eca705379", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-301862773-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ddfd0140eaa4d5e8d43efda963767d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap2dd0148d-97", "ovs_interfaceid": "2dd0148d-97f4-4aff-a43b-54bd6cd5a349", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 22:08:06 compute-0 nova_compute[192716]: 2025-10-07 22:08:06.796 2 DEBUG nova.network.os_vif_util [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:26:cd:64,bridge_name='br-int',has_traffic_filtering=True,id=2dd0148d-97f4-4aff-a43b-54bd6cd5a349,network=Network(7f17307e-ac72-4a6f-8a05-ba2eca705379),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2dd0148d-97') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 22:08:06 compute-0 nova_compute[192716]: 2025-10-07 22:08:06.797 2 DEBUG os_vif [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:26:cd:64,bridge_name='br-int',has_traffic_filtering=True,id=2dd0148d-97f4-4aff-a43b-54bd6cd5a349,network=Network(7f17307e-ac72-4a6f-8a05-ba2eca705379),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2dd0148d-97') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 07 22:08:06 compute-0 nova_compute[192716]: 2025-10-07 22:08:06.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:06 compute-0 nova_compute[192716]: 2025-10-07 22:08:06.799 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:08:06 compute-0 nova_compute[192716]: 2025-10-07 22:08:06.799 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:08:06 compute-0 nova_compute[192716]: 2025-10-07 22:08:06.801 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:06 compute-0 nova_compute[192716]: 2025-10-07 22:08:06.801 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '3c4d8f5f-a51a-590a-a71d-2e14c3fc9102', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:08:06 compute-0 nova_compute[192716]: 2025-10-07 22:08:06.803 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:06 compute-0 nova_compute[192716]: 2025-10-07 22:08:06.806 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 07 22:08:06 compute-0 nova_compute[192716]: 2025-10-07 22:08:06.810 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:06 compute-0 nova_compute[192716]: 2025-10-07 22:08:06.811 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2dd0148d-97, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:08:06 compute-0 nova_compute[192716]: 2025-10-07 22:08:06.811 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap2dd0148d-97, col_values=(('qos', UUID('da1d3029-2faf-47aa-9311-1dd9c20c2450')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:08:06 compute-0 nova_compute[192716]: 2025-10-07 22:08:06.812 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap2dd0148d-97, col_values=(('external_ids', {'iface-id': '2dd0148d-97f4-4aff-a43b-54bd6cd5a349', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:26:cd:64', 'vm-uuid': '1c6e57e1-3513-4376-b92a-8cbe948d8ec3'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:08:06 compute-0 nova_compute[192716]: 2025-10-07 22:08:06.813 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:06 compute-0 NetworkManager[51722]: <info>  [1759874886.8144] manager: (tap2dd0148d-97): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/67)
Oct 07 22:08:06 compute-0 nova_compute[192716]: 2025-10-07 22:08:06.815 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 07 22:08:06 compute-0 nova_compute[192716]: 2025-10-07 22:08:06.824 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:06 compute-0 nova_compute[192716]: 2025-10-07 22:08:06.825 2 INFO os_vif [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:26:cd:64,bridge_name='br-int',has_traffic_filtering=True,id=2dd0148d-97f4-4aff-a43b-54bd6cd5a349,network=Network(7f17307e-ac72-4a6f-8a05-ba2eca705379),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2dd0148d-97')
Oct 07 22:08:06 compute-0 nova_compute[192716]: 2025-10-07 22:08:06.826 2 DEBUG nova.virt.libvirt.driver [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Oct 07 22:08:06 compute-0 nova_compute[192716]: 2025-10-07 22:08:06.826 2 DEBUG nova.compute.manager [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmplxv2by01',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='1c6e57e1-3513-4376-b92a-8cbe948d8ec3',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Oct 07 22:08:06 compute-0 nova_compute[192716]: 2025-10-07 22:08:06.827 2 WARNING neutronclient.v2_0.client [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:08:07 compute-0 nova_compute[192716]: 2025-10-07 22:08:07.139 2 WARNING neutronclient.v2_0.client [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:08:07 compute-0 nova_compute[192716]: 2025-10-07 22:08:07.880 2 DEBUG nova.network.neutron [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 1c6e57e1-3513-4376-b92a-8cbe948d8ec3] Port 2dd0148d-97f4-4aff-a43b-54bd6cd5a349 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Oct 07 22:08:07 compute-0 nova_compute[192716]: 2025-10-07 22:08:07.896 2 DEBUG nova.compute.manager [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmplxv2by01',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='1c6e57e1-3513-4376-b92a-8cbe948d8ec3',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Oct 07 22:08:08 compute-0 ovn_controller[94904]: 2025-10-07T22:08:08Z|00177|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Oct 07 22:08:08 compute-0 podman[222762]: 2025-10-07 22:08:08.818037946 +0000 UTC m=+0.058992592 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 07 22:08:10 compute-0 nova_compute[192716]: 2025-10-07 22:08:10.116 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:11 compute-0 NetworkManager[51722]: <info>  [1759874891.3598] manager: (tap2dd0148d-97): new Tun device (/org/freedesktop/NetworkManager/Devices/68)
Oct 07 22:08:11 compute-0 kernel: tap2dd0148d-97: entered promiscuous mode
Oct 07 22:08:11 compute-0 nova_compute[192716]: 2025-10-07 22:08:11.364 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:11 compute-0 ovn_controller[94904]: 2025-10-07T22:08:11Z|00178|binding|INFO|Claiming lport 2dd0148d-97f4-4aff-a43b-54bd6cd5a349 for this additional chassis.
Oct 07 22:08:11 compute-0 ovn_controller[94904]: 2025-10-07T22:08:11Z|00179|binding|INFO|2dd0148d-97f4-4aff-a43b-54bd6cd5a349: Claiming fa:16:3e:26:cd:64 10.100.0.11
Oct 07 22:08:11 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:08:11.375 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:26:cd:64 10.100.0.11'], port_security=['fa:16:3e:26:cd:64 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '1c6e57e1-3513-4376-b92a-8cbe948d8ec3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7f17307e-ac72-4a6f-8a05-ba2eca705379', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'faa7f94deef04b67982eaf47a775c225', 'neutron:revision_number': '10', 'neutron:security_group_ids': '6ea0c626-bce8-4d7e-8c0d-f51033bcdaff', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1675f3b1-9c7c-4176-8c45-0239d0b298ba, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=2dd0148d-97f4-4aff-a43b-54bd6cd5a349) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:08:11 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:08:11.376 103791 INFO neutron.agent.ovn.metadata.agent [-] Port 2dd0148d-97f4-4aff-a43b-54bd6cd5a349 in datapath 7f17307e-ac72-4a6f-8a05-ba2eca705379 unbound from our chassis
Oct 07 22:08:11 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:08:11.378 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7f17307e-ac72-4a6f-8a05-ba2eca705379
Oct 07 22:08:11 compute-0 ovn_controller[94904]: 2025-10-07T22:08:11Z|00180|binding|INFO|Setting lport 2dd0148d-97f4-4aff-a43b-54bd6cd5a349 ovn-installed in OVS
Oct 07 22:08:11 compute-0 nova_compute[192716]: 2025-10-07 22:08:11.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:11 compute-0 nova_compute[192716]: 2025-10-07 22:08:11.382 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:11 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:08:11.404 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[60be30b4-0df0-4ef4-8ca9-362c72e6a9a0]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:08:11 compute-0 systemd-udevd[222802]: Network interface NamePolicy= disabled on kernel command line.
Oct 07 22:08:11 compute-0 systemd-machined[152719]: New machine qemu-15-instance-00000012.
Oct 07 22:08:11 compute-0 NetworkManager[51722]: <info>  [1759874891.4212] device (tap2dd0148d-97): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 07 22:08:11 compute-0 NetworkManager[51722]: <info>  [1759874891.4221] device (tap2dd0148d-97): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 07 22:08:11 compute-0 systemd[1]: Started Virtual Machine qemu-15-instance-00000012.
Oct 07 22:08:11 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:08:11.456 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[fb804a67-beec-4619-af1d-9663a7ed982c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:08:11 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:08:11.459 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[add0a905-12d3-46f4-9026-84922b7322cb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:08:11 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:08:11.493 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[94b68f28-11fc-4d05-9622-c65f7ff14d09]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:08:11 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:08:11.511 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[cdc9bbe6-a680-4fcc-980b-e2084e5f7b61]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7f17307e-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:d3:d5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 6, 'rx_bytes': 916, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 6, 'rx_bytes': 916, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 46], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 459202, 'reachable_time': 23745, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222815, 'error': None, 'target': 'ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:08:11 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:08:11.535 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[68e94bdc-6244-4190-81d0-c808330bfa8d]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap7f17307e-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 459214, 'tstamp': 459214}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222816, 'error': None, 'target': 'ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap7f17307e-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 459217, 'tstamp': 459217}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222816, 'error': None, 'target': 'ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:08:11 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:08:11.537 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7f17307e-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:08:11 compute-0 nova_compute[192716]: 2025-10-07 22:08:11.538 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:11 compute-0 nova_compute[192716]: 2025-10-07 22:08:11.540 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:11 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:08:11.540 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7f17307e-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:08:11 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:08:11.540 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:08:11 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:08:11.540 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7f17307e-a0, col_values=(('external_ids', {'iface-id': '6865dbad-0588-4cfd-9a22-08a49ea1d5a5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:08:11 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:08:11.541 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:08:11 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:08:11.542 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[79fd0628-0cf9-43ed-91cb-362ca9f71039]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-7f17307e-ac72-4a6f-8a05-ba2eca705379\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/7f17307e-ac72-4a6f-8a05-ba2eca705379.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 7f17307e-ac72-4a6f-8a05-ba2eca705379\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:08:11 compute-0 nova_compute[192716]: 2025-10-07 22:08:11.814 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:14 compute-0 ovn_controller[94904]: 2025-10-07T22:08:14Z|00181|binding|INFO|Claiming lport 2dd0148d-97f4-4aff-a43b-54bd6cd5a349 for this chassis.
Oct 07 22:08:14 compute-0 ovn_controller[94904]: 2025-10-07T22:08:14Z|00182|binding|INFO|2dd0148d-97f4-4aff-a43b-54bd6cd5a349: Claiming fa:16:3e:26:cd:64 10.100.0.11
Oct 07 22:08:14 compute-0 ovn_controller[94904]: 2025-10-07T22:08:14Z|00183|binding|INFO|Setting lport 2dd0148d-97f4-4aff-a43b-54bd6cd5a349 up in Southbound
Oct 07 22:08:15 compute-0 nova_compute[192716]: 2025-10-07 22:08:15.118 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:15 compute-0 nova_compute[192716]: 2025-10-07 22:08:15.615 2 INFO nova.compute.manager [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 1c6e57e1-3513-4376-b92a-8cbe948d8ec3] Post operation of migration started
Oct 07 22:08:15 compute-0 nova_compute[192716]: 2025-10-07 22:08:15.616 2 WARNING neutronclient.v2_0.client [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:08:16 compute-0 nova_compute[192716]: 2025-10-07 22:08:16.182 2 WARNING neutronclient.v2_0.client [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:08:16 compute-0 nova_compute[192716]: 2025-10-07 22:08:16.183 2 WARNING neutronclient.v2_0.client [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:08:16 compute-0 nova_compute[192716]: 2025-10-07 22:08:16.338 2 DEBUG oslo_concurrency.lockutils [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "refresh_cache-1c6e57e1-3513-4376-b92a-8cbe948d8ec3" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 07 22:08:16 compute-0 nova_compute[192716]: 2025-10-07 22:08:16.339 2 DEBUG oslo_concurrency.lockutils [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquired lock "refresh_cache-1c6e57e1-3513-4376-b92a-8cbe948d8ec3" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 07 22:08:16 compute-0 nova_compute[192716]: 2025-10-07 22:08:16.339 2 DEBUG nova.network.neutron [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 1c6e57e1-3513-4376-b92a-8cbe948d8ec3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 07 22:08:16 compute-0 nova_compute[192716]: 2025-10-07 22:08:16.815 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:16 compute-0 nova_compute[192716]: 2025-10-07 22:08:16.849 2 WARNING neutronclient.v2_0.client [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:08:16 compute-0 podman[222835]: 2025-10-07 22:08:16.887158172 +0000 UTC m=+0.115352217 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct 07 22:08:17 compute-0 nova_compute[192716]: 2025-10-07 22:08:17.563 2 WARNING neutronclient.v2_0.client [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:08:17 compute-0 nova_compute[192716]: 2025-10-07 22:08:17.747 2 DEBUG nova.network.neutron [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 1c6e57e1-3513-4376-b92a-8cbe948d8ec3] Updating instance_info_cache with network_info: [{"id": "2dd0148d-97f4-4aff-a43b-54bd6cd5a349", "address": "fa:16:3e:26:cd:64", "network": {"id": "7f17307e-ac72-4a6f-8a05-ba2eca705379", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-301862773-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ddfd0140eaa4d5e8d43efda963767d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2dd0148d-97", "ovs_interfaceid": "2dd0148d-97f4-4aff-a43b-54bd6cd5a349", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:08:18 compute-0 nova_compute[192716]: 2025-10-07 22:08:18.253 2 DEBUG oslo_concurrency.lockutils [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Releasing lock "refresh_cache-1c6e57e1-3513-4376-b92a-8cbe948d8ec3" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 07 22:08:18 compute-0 nova_compute[192716]: 2025-10-07 22:08:18.772 2 DEBUG oslo_concurrency.lockutils [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:08:18 compute-0 nova_compute[192716]: 2025-10-07 22:08:18.773 2 DEBUG oslo_concurrency.lockutils [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:08:18 compute-0 nova_compute[192716]: 2025-10-07 22:08:18.773 2 DEBUG oslo_concurrency.lockutils [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:08:18 compute-0 nova_compute[192716]: 2025-10-07 22:08:18.781 2 INFO nova.virt.libvirt.driver [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 1c6e57e1-3513-4376-b92a-8cbe948d8ec3] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Oct 07 22:08:18 compute-0 virtqemud[192532]: Domain id=15 name='instance-00000012' uuid=1c6e57e1-3513-4376-b92a-8cbe948d8ec3 is tainted: custom-monitor
Oct 07 22:08:18 compute-0 podman[222862]: 2025-10-07 22:08:18.838707106 +0000 UTC m=+0.061620428 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4)
Oct 07 22:08:19 compute-0 nova_compute[192716]: 2025-10-07 22:08:19.791 2 INFO nova.virt.libvirt.driver [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 1c6e57e1-3513-4376-b92a-8cbe948d8ec3] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Oct 07 22:08:20 compute-0 nova_compute[192716]: 2025-10-07 22:08:20.120 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:20 compute-0 nova_compute[192716]: 2025-10-07 22:08:20.800 2 INFO nova.virt.libvirt.driver [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 1c6e57e1-3513-4376-b92a-8cbe948d8ec3] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Oct 07 22:08:20 compute-0 nova_compute[192716]: 2025-10-07 22:08:20.807 2 DEBUG nova.compute.manager [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 1c6e57e1-3513-4376-b92a-8cbe948d8ec3] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 07 22:08:21 compute-0 nova_compute[192716]: 2025-10-07 22:08:21.319 2 DEBUG nova.objects.instance [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 1c6e57e1-3513-4376-b92a-8cbe948d8ec3] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Oct 07 22:08:21 compute-0 nova_compute[192716]: 2025-10-07 22:08:21.818 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:22 compute-0 nova_compute[192716]: 2025-10-07 22:08:22.336 2 WARNING neutronclient.v2_0.client [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:08:22 compute-0 nova_compute[192716]: 2025-10-07 22:08:22.418 2 WARNING neutronclient.v2_0.client [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:08:22 compute-0 nova_compute[192716]: 2025-10-07 22:08:22.419 2 WARNING neutronclient.v2_0.client [None req-937401a1-0909-4103-be6d-40ee9bc1dc8b 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:08:22 compute-0 podman[222880]: 2025-10-07 22:08:22.830294987 +0000 UTC m=+0.072393867 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, architecture=x86_64, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, name=ubi9-minimal)
Oct 07 22:08:25 compute-0 nova_compute[192716]: 2025-10-07 22:08:25.168 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:08:25.639 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:08:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:08:25.639 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:08:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:08:25.641 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:08:26 compute-0 nova_compute[192716]: 2025-10-07 22:08:26.820 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:27 compute-0 nova_compute[192716]: 2025-10-07 22:08:27.236 2 DEBUG oslo_concurrency.lockutils [None req-4d3a3270-6f5b-41c4-a742-d3318d7858de 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Acquiring lock "3158f0ab-25fe-4a1a-8c95-8d9b702e260b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:08:27 compute-0 nova_compute[192716]: 2025-10-07 22:08:27.236 2 DEBUG oslo_concurrency.lockutils [None req-4d3a3270-6f5b-41c4-a742-d3318d7858de 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Lock "3158f0ab-25fe-4a1a-8c95-8d9b702e260b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:08:27 compute-0 nova_compute[192716]: 2025-10-07 22:08:27.236 2 DEBUG oslo_concurrency.lockutils [None req-4d3a3270-6f5b-41c4-a742-d3318d7858de 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Acquiring lock "3158f0ab-25fe-4a1a-8c95-8d9b702e260b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:08:27 compute-0 nova_compute[192716]: 2025-10-07 22:08:27.236 2 DEBUG oslo_concurrency.lockutils [None req-4d3a3270-6f5b-41c4-a742-d3318d7858de 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Lock "3158f0ab-25fe-4a1a-8c95-8d9b702e260b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:08:27 compute-0 nova_compute[192716]: 2025-10-07 22:08:27.237 2 DEBUG oslo_concurrency.lockutils [None req-4d3a3270-6f5b-41c4-a742-d3318d7858de 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Lock "3158f0ab-25fe-4a1a-8c95-8d9b702e260b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:08:27 compute-0 nova_compute[192716]: 2025-10-07 22:08:27.255 2 INFO nova.compute.manager [None req-4d3a3270-6f5b-41c4-a742-d3318d7858de 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Terminating instance
Oct 07 22:08:27 compute-0 nova_compute[192716]: 2025-10-07 22:08:27.779 2 DEBUG nova.compute.manager [None req-4d3a3270-6f5b-41c4-a742-d3318d7858de 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 07 22:08:27 compute-0 kernel: tapdae0547a-45 (unregistering): left promiscuous mode
Oct 07 22:08:27 compute-0 NetworkManager[51722]: <info>  [1759874907.8135] device (tapdae0547a-45): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 07 22:08:27 compute-0 ovn_controller[94904]: 2025-10-07T22:08:27Z|00184|binding|INFO|Releasing lport dae0547a-45c6-4b1f-bd90-f18af339dcb3 from this chassis (sb_readonly=0)
Oct 07 22:08:27 compute-0 ovn_controller[94904]: 2025-10-07T22:08:27Z|00185|binding|INFO|Setting lport dae0547a-45c6-4b1f-bd90-f18af339dcb3 down in Southbound
Oct 07 22:08:27 compute-0 ovn_controller[94904]: 2025-10-07T22:08:27Z|00186|binding|INFO|Removing iface tapdae0547a-45 ovn-installed in OVS
Oct 07 22:08:27 compute-0 nova_compute[192716]: 2025-10-07 22:08:27.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:27 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:08:27.832 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:70:80:f5 10.100.0.7'], port_security=['fa:16:3e:70:80:f5 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '3158f0ab-25fe-4a1a-8c95-8d9b702e260b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7f17307e-ac72-4a6f-8a05-ba2eca705379', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'faa7f94deef04b67982eaf47a775c225', 'neutron:revision_number': '5', 'neutron:security_group_ids': '6ea0c626-bce8-4d7e-8c0d-f51033bcdaff', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1675f3b1-9c7c-4176-8c45-0239d0b298ba, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], logical_port=dae0547a-45c6-4b1f-bd90-f18af339dcb3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f071072e510>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:08:27 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:08:27.834 103791 INFO neutron.agent.ovn.metadata.agent [-] Port dae0547a-45c6-4b1f-bd90-f18af339dcb3 in datapath 7f17307e-ac72-4a6f-8a05-ba2eca705379 unbound from our chassis
Oct 07 22:08:27 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:08:27.835 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7f17307e-ac72-4a6f-8a05-ba2eca705379
Oct 07 22:08:27 compute-0 nova_compute[192716]: 2025-10-07 22:08:27.850 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:27 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:08:27.855 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[ab4c9d1c-83a6-4683-9f80-dd313edd9530]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:08:27 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:08:27.893 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[7e344dab-295c-4ba6-a703-2b5cebae169d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:08:27 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:08:27.897 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[9cebc951-13fd-4f90-a6e4-ed9975255005]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:08:27 compute-0 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000013.scope: Deactivated successfully.
Oct 07 22:08:27 compute-0 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000013.scope: Consumed 13.640s CPU time.
Oct 07 22:08:27 compute-0 systemd-machined[152719]: Machine qemu-14-instance-00000013 terminated.
Oct 07 22:08:27 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:08:27.939 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[b8e4a843-5215-402f-9db2-349a75477990]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:08:27 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:08:27.963 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[8ba60ae1-2a75-47c3-9431-a7d6c3f38fa7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7f17307e-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:d3:d5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 30, 'tx_packets': 8, 'rx_bytes': 1756, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 30, 'tx_packets': 8, 'rx_bytes': 1756, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 46], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 459202, 'reachable_time': 23745, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222915, 'error': None, 'target': 'ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:08:27 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:08:27.986 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[54223bc4-9e6e-4bb1-987c-4d36ef4d628f]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap7f17307e-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 459214, 'tstamp': 459214}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222916, 'error': None, 'target': 'ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap7f17307e-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 459217, 'tstamp': 459217}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222916, 'error': None, 'target': 'ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:08:27 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:08:27.988 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7f17307e-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:08:27 compute-0 nova_compute[192716]: 2025-10-07 22:08:27.989 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:27 compute-0 nova_compute[192716]: 2025-10-07 22:08:27.997 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:27 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:08:27.997 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7f17307e-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:08:27 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:08:27.998 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:08:27 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:08:27.998 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7f17307e-a0, col_values=(('external_ids', {'iface-id': '6865dbad-0588-4cfd-9a22-08a49ea1d5a5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:08:27 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:08:27.999 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:08:28 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:08:28.001 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[b9fb881d-a2dd-48de-af31-c3fe4c9d6a87]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-7f17307e-ac72-4a6f-8a05-ba2eca705379\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/7f17307e-ac72-4a6f-8a05-ba2eca705379.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 7f17307e-ac72-4a6f-8a05-ba2eca705379\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:08:28 compute-0 nova_compute[192716]: 2025-10-07 22:08:28.017 2 DEBUG nova.compute.manager [req-40cb9139-aeca-466b-af0a-30d7ca239b2c req-ac188084-22ed-488d-9dcb-0a39677c80ef 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Received event network-vif-unplugged-dae0547a-45c6-4b1f-bd90-f18af339dcb3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:08:28 compute-0 nova_compute[192716]: 2025-10-07 22:08:28.017 2 DEBUG oslo_concurrency.lockutils [req-40cb9139-aeca-466b-af0a-30d7ca239b2c req-ac188084-22ed-488d-9dcb-0a39677c80ef 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "3158f0ab-25fe-4a1a-8c95-8d9b702e260b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:08:28 compute-0 nova_compute[192716]: 2025-10-07 22:08:28.017 2 DEBUG oslo_concurrency.lockutils [req-40cb9139-aeca-466b-af0a-30d7ca239b2c req-ac188084-22ed-488d-9dcb-0a39677c80ef 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "3158f0ab-25fe-4a1a-8c95-8d9b702e260b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:08:28 compute-0 nova_compute[192716]: 2025-10-07 22:08:28.017 2 DEBUG oslo_concurrency.lockutils [req-40cb9139-aeca-466b-af0a-30d7ca239b2c req-ac188084-22ed-488d-9dcb-0a39677c80ef 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "3158f0ab-25fe-4a1a-8c95-8d9b702e260b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:08:28 compute-0 nova_compute[192716]: 2025-10-07 22:08:28.017 2 DEBUG nova.compute.manager [req-40cb9139-aeca-466b-af0a-30d7ca239b2c req-ac188084-22ed-488d-9dcb-0a39677c80ef 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] No waiting events found dispatching network-vif-unplugged-dae0547a-45c6-4b1f-bd90-f18af339dcb3 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 22:08:28 compute-0 nova_compute[192716]: 2025-10-07 22:08:28.018 2 DEBUG nova.compute.manager [req-40cb9139-aeca-466b-af0a-30d7ca239b2c req-ac188084-22ed-488d-9dcb-0a39677c80ef 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Received event network-vif-unplugged-dae0547a-45c6-4b1f-bd90-f18af339dcb3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 07 22:08:28 compute-0 nova_compute[192716]: 2025-10-07 22:08:28.043 2 INFO nova.virt.libvirt.driver [-] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Instance destroyed successfully.
Oct 07 22:08:28 compute-0 nova_compute[192716]: 2025-10-07 22:08:28.044 2 DEBUG nova.objects.instance [None req-4d3a3270-6f5b-41c4-a742-d3318d7858de 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Lazy-loading 'resources' on Instance uuid 3158f0ab-25fe-4a1a-8c95-8d9b702e260b obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 22:08:28 compute-0 nova_compute[192716]: 2025-10-07 22:08:28.549 2 DEBUG nova.virt.libvirt.vif [None req-4d3a3270-6f5b-41c4-a742-d3318d7858de 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-10-07T22:07:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-2006630721',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-200',id=19,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-07T22:07:40Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='faa7f94deef04b67982eaf47a775c225',ramdisk_id='',reservation_id='r-1zu2dw75',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='v
irtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-2065444639',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-2065444639-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T22:07:40Z,user_data=None,user_id='641fbca23ed24b428028d3bc567991bf',uuid=3158f0ab-25fe-4a1a-8c95-8d9b702e260b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "dae0547a-45c6-4b1f-bd90-f18af339dcb3", "address": "fa:16:3e:70:80:f5", "network": {"id": "7f17307e-ac72-4a6f-8a05-ba2eca705379", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-301862773-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ddfd0140eaa4d5e8d43efda963767d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdae0547a-45", "ovs_interfaceid": "dae0547a-45c6-4b1f-bd90-f18af339dcb3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 07 22:08:28 compute-0 nova_compute[192716]: 2025-10-07 22:08:28.550 2 DEBUG nova.network.os_vif_util [None req-4d3a3270-6f5b-41c4-a742-d3318d7858de 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Converting VIF {"id": "dae0547a-45c6-4b1f-bd90-f18af339dcb3", "address": "fa:16:3e:70:80:f5", "network": {"id": "7f17307e-ac72-4a6f-8a05-ba2eca705379", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-301862773-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ddfd0140eaa4d5e8d43efda963767d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdae0547a-45", "ovs_interfaceid": "dae0547a-45c6-4b1f-bd90-f18af339dcb3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 22:08:28 compute-0 nova_compute[192716]: 2025-10-07 22:08:28.551 2 DEBUG nova.network.os_vif_util [None req-4d3a3270-6f5b-41c4-a742-d3318d7858de 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:70:80:f5,bridge_name='br-int',has_traffic_filtering=True,id=dae0547a-45c6-4b1f-bd90-f18af339dcb3,network=Network(7f17307e-ac72-4a6f-8a05-ba2eca705379),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdae0547a-45') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 22:08:28 compute-0 nova_compute[192716]: 2025-10-07 22:08:28.552 2 DEBUG os_vif [None req-4d3a3270-6f5b-41c4-a742-d3318d7858de 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:70:80:f5,bridge_name='br-int',has_traffic_filtering=True,id=dae0547a-45c6-4b1f-bd90-f18af339dcb3,network=Network(7f17307e-ac72-4a6f-8a05-ba2eca705379),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdae0547a-45') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 07 22:08:28 compute-0 nova_compute[192716]: 2025-10-07 22:08:28.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:28 compute-0 nova_compute[192716]: 2025-10-07 22:08:28.555 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdae0547a-45, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:08:28 compute-0 nova_compute[192716]: 2025-10-07 22:08:28.559 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:28 compute-0 nova_compute[192716]: 2025-10-07 22:08:28.560 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:28 compute-0 nova_compute[192716]: 2025-10-07 22:08:28.561 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=b8fa3d8a-6bbf-4124-a430-a081988c157d) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:08:28 compute-0 nova_compute[192716]: 2025-10-07 22:08:28.562 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:28 compute-0 nova_compute[192716]: 2025-10-07 22:08:28.563 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:28 compute-0 nova_compute[192716]: 2025-10-07 22:08:28.566 2 INFO os_vif [None req-4d3a3270-6f5b-41c4-a742-d3318d7858de 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:70:80:f5,bridge_name='br-int',has_traffic_filtering=True,id=dae0547a-45c6-4b1f-bd90-f18af339dcb3,network=Network(7f17307e-ac72-4a6f-8a05-ba2eca705379),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdae0547a-45')
Oct 07 22:08:28 compute-0 nova_compute[192716]: 2025-10-07 22:08:28.567 2 INFO nova.virt.libvirt.driver [None req-4d3a3270-6f5b-41c4-a742-d3318d7858de 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Deleting instance files /var/lib/nova/instances/3158f0ab-25fe-4a1a-8c95-8d9b702e260b_del
Oct 07 22:08:28 compute-0 nova_compute[192716]: 2025-10-07 22:08:28.568 2 INFO nova.virt.libvirt.driver [None req-4d3a3270-6f5b-41c4-a742-d3318d7858de 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Deletion of /var/lib/nova/instances/3158f0ab-25fe-4a1a-8c95-8d9b702e260b_del complete
Oct 07 22:08:29 compute-0 nova_compute[192716]: 2025-10-07 22:08:29.083 2 INFO nova.compute.manager [None req-4d3a3270-6f5b-41c4-a742-d3318d7858de 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Took 1.30 seconds to destroy the instance on the hypervisor.
Oct 07 22:08:29 compute-0 nova_compute[192716]: 2025-10-07 22:08:29.084 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-4d3a3270-6f5b-41c4-a742-d3318d7858de 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 07 22:08:29 compute-0 nova_compute[192716]: 2025-10-07 22:08:29.084 2 DEBUG nova.compute.manager [-] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 07 22:08:29 compute-0 nova_compute[192716]: 2025-10-07 22:08:29.084 2 DEBUG nova.network.neutron [-] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 07 22:08:29 compute-0 nova_compute[192716]: 2025-10-07 22:08:29.084 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:08:29 compute-0 podman[203153]: time="2025-10-07T22:08:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:08:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:08:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20751 "" "Go-http-client/1.1"
Oct 07 22:08:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:08:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3485 "" "Go-http-client/1.1"
Oct 07 22:08:29 compute-0 nova_compute[192716]: 2025-10-07 22:08:29.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:08:30 compute-0 nova_compute[192716]: 2025-10-07 22:08:30.110 2 DEBUG nova.compute.manager [req-41e65c26-578b-4d3b-adad-c16f3ee8d5dc req-42a0da1b-a01c-4818-911e-a5b5bc0fed88 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Received event network-vif-unplugged-dae0547a-45c6-4b1f-bd90-f18af339dcb3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:08:30 compute-0 nova_compute[192716]: 2025-10-07 22:08:30.111 2 DEBUG oslo_concurrency.lockutils [req-41e65c26-578b-4d3b-adad-c16f3ee8d5dc req-42a0da1b-a01c-4818-911e-a5b5bc0fed88 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "3158f0ab-25fe-4a1a-8c95-8d9b702e260b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:08:30 compute-0 nova_compute[192716]: 2025-10-07 22:08:30.111 2 DEBUG oslo_concurrency.lockutils [req-41e65c26-578b-4d3b-adad-c16f3ee8d5dc req-42a0da1b-a01c-4818-911e-a5b5bc0fed88 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "3158f0ab-25fe-4a1a-8c95-8d9b702e260b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:08:30 compute-0 nova_compute[192716]: 2025-10-07 22:08:30.112 2 DEBUG oslo_concurrency.lockutils [req-41e65c26-578b-4d3b-adad-c16f3ee8d5dc req-42a0da1b-a01c-4818-911e-a5b5bc0fed88 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "3158f0ab-25fe-4a1a-8c95-8d9b702e260b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:08:30 compute-0 nova_compute[192716]: 2025-10-07 22:08:30.112 2 DEBUG nova.compute.manager [req-41e65c26-578b-4d3b-adad-c16f3ee8d5dc req-42a0da1b-a01c-4818-911e-a5b5bc0fed88 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] No waiting events found dispatching network-vif-unplugged-dae0547a-45c6-4b1f-bd90-f18af339dcb3 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 22:08:30 compute-0 nova_compute[192716]: 2025-10-07 22:08:30.113 2 DEBUG nova.compute.manager [req-41e65c26-578b-4d3b-adad-c16f3ee8d5dc req-42a0da1b-a01c-4818-911e-a5b5bc0fed88 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Received event network-vif-unplugged-dae0547a-45c6-4b1f-bd90-f18af339dcb3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 07 22:08:30 compute-0 nova_compute[192716]: 2025-10-07 22:08:30.159 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:08:30 compute-0 nova_compute[192716]: 2025-10-07 22:08:30.172 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:31 compute-0 openstack_network_exporter[205305]: ERROR   22:08:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:08:31 compute-0 openstack_network_exporter[205305]: ERROR   22:08:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:08:31 compute-0 openstack_network_exporter[205305]: ERROR   22:08:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:08:31 compute-0 openstack_network_exporter[205305]: ERROR   22:08:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:08:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:08:31 compute-0 openstack_network_exporter[205305]: ERROR   22:08:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:08:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:08:31 compute-0 nova_compute[192716]: 2025-10-07 22:08:31.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:08:31 compute-0 nova_compute[192716]: 2025-10-07 22:08:31.990 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 07 22:08:32 compute-0 nova_compute[192716]: 2025-10-07 22:08:32.236 2 DEBUG nova.compute.manager [req-abfacb28-909f-434a-9d6a-66c16b33a64f req-48ccfd1b-1355-4893-a36c-333df60f3d12 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Received event network-vif-deleted-dae0547a-45c6-4b1f-bd90-f18af339dcb3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:08:32 compute-0 nova_compute[192716]: 2025-10-07 22:08:32.236 2 INFO nova.compute.manager [req-abfacb28-909f-434a-9d6a-66c16b33a64f req-48ccfd1b-1355-4893-a36c-333df60f3d12 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Neutron deleted interface dae0547a-45c6-4b1f-bd90-f18af339dcb3; detaching it from the instance and deleting it from the info cache
Oct 07 22:08:32 compute-0 nova_compute[192716]: 2025-10-07 22:08:32.237 2 DEBUG nova.network.neutron [req-abfacb28-909f-434a-9d6a-66c16b33a64f req-48ccfd1b-1355-4893-a36c-333df60f3d12 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:08:32 compute-0 nova_compute[192716]: 2025-10-07 22:08:32.681 2 DEBUG nova.network.neutron [-] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:08:32 compute-0 nova_compute[192716]: 2025-10-07 22:08:32.745 2 DEBUG nova.compute.manager [req-abfacb28-909f-434a-9d6a-66c16b33a64f req-48ccfd1b-1355-4893-a36c-333df60f3d12 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Detach interface failed, port_id=dae0547a-45c6-4b1f-bd90-f18af339dcb3, reason: Instance 3158f0ab-25fe-4a1a-8c95-8d9b702e260b could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 07 22:08:33 compute-0 nova_compute[192716]: 2025-10-07 22:08:33.189 2 INFO nova.compute.manager [-] [instance: 3158f0ab-25fe-4a1a-8c95-8d9b702e260b] Took 4.11 seconds to deallocate network for instance.
Oct 07 22:08:33 compute-0 nova_compute[192716]: 2025-10-07 22:08:33.562 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:33 compute-0 nova_compute[192716]: 2025-10-07 22:08:33.716 2 DEBUG oslo_concurrency.lockutils [None req-4d3a3270-6f5b-41c4-a742-d3318d7858de 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:08:33 compute-0 nova_compute[192716]: 2025-10-07 22:08:33.717 2 DEBUG oslo_concurrency.lockutils [None req-4d3a3270-6f5b-41c4-a742-d3318d7858de 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:08:33 compute-0 nova_compute[192716]: 2025-10-07 22:08:33.893 2 DEBUG nova.compute.provider_tree [None req-4d3a3270-6f5b-41c4-a742-d3318d7858de 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 22:08:34 compute-0 nova_compute[192716]: 2025-10-07 22:08:34.402 2 DEBUG nova.scheduler.client.report [None req-4d3a3270-6f5b-41c4-a742-d3318d7858de 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 22:08:34 compute-0 nova_compute[192716]: 2025-10-07 22:08:34.913 2 DEBUG oslo_concurrency.lockutils [None req-4d3a3270-6f5b-41c4-a742-d3318d7858de 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.196s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:08:34 compute-0 nova_compute[192716]: 2025-10-07 22:08:34.948 2 INFO nova.scheduler.client.report [None req-4d3a3270-6f5b-41c4-a742-d3318d7858de 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Deleted allocations for instance 3158f0ab-25fe-4a1a-8c95-8d9b702e260b
Oct 07 22:08:35 compute-0 nova_compute[192716]: 2025-10-07 22:08:35.174 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:35 compute-0 nova_compute[192716]: 2025-10-07 22:08:35.982 2 DEBUG oslo_concurrency.lockutils [None req-4d3a3270-6f5b-41c4-a742-d3318d7858de 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Lock "3158f0ab-25fe-4a1a-8c95-8d9b702e260b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 8.746s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:08:35 compute-0 nova_compute[192716]: 2025-10-07 22:08:35.986 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:08:36 compute-0 nova_compute[192716]: 2025-10-07 22:08:36.473 2 DEBUG oslo_concurrency.lockutils [None req-e1bb14d5-0eb9-4c7a-8896-5eea632c17cd 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Acquiring lock "1c6e57e1-3513-4376-b92a-8cbe948d8ec3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:08:36 compute-0 nova_compute[192716]: 2025-10-07 22:08:36.474 2 DEBUG oslo_concurrency.lockutils [None req-e1bb14d5-0eb9-4c7a-8896-5eea632c17cd 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Lock "1c6e57e1-3513-4376-b92a-8cbe948d8ec3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:08:36 compute-0 nova_compute[192716]: 2025-10-07 22:08:36.474 2 DEBUG oslo_concurrency.lockutils [None req-e1bb14d5-0eb9-4c7a-8896-5eea632c17cd 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Acquiring lock "1c6e57e1-3513-4376-b92a-8cbe948d8ec3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:08:36 compute-0 nova_compute[192716]: 2025-10-07 22:08:36.474 2 DEBUG oslo_concurrency.lockutils [None req-e1bb14d5-0eb9-4c7a-8896-5eea632c17cd 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Lock "1c6e57e1-3513-4376-b92a-8cbe948d8ec3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:08:36 compute-0 nova_compute[192716]: 2025-10-07 22:08:36.475 2 DEBUG oslo_concurrency.lockutils [None req-e1bb14d5-0eb9-4c7a-8896-5eea632c17cd 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Lock "1c6e57e1-3513-4376-b92a-8cbe948d8ec3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:08:36 compute-0 nova_compute[192716]: 2025-10-07 22:08:36.494 2 INFO nova.compute.manager [None req-e1bb14d5-0eb9-4c7a-8896-5eea632c17cd 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 1c6e57e1-3513-4376-b92a-8cbe948d8ec3] Terminating instance
Oct 07 22:08:36 compute-0 podman[222935]: 2025-10-07 22:08:36.902697472 +0000 UTC m=+0.125580412 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251007, tcib_build_tag=watcher_latest, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid)
Oct 07 22:08:36 compute-0 podman[222936]: 2025-10-07 22:08:36.904786022 +0000 UTC m=+0.127205778 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd, org.label-schema.license=GPLv2)
Oct 07 22:08:37 compute-0 nova_compute[192716]: 2025-10-07 22:08:37.009 2 DEBUG nova.compute.manager [None req-e1bb14d5-0eb9-4c7a-8896-5eea632c17cd 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 1c6e57e1-3513-4376-b92a-8cbe948d8ec3] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 07 22:08:37 compute-0 kernel: tap2dd0148d-97 (unregistering): left promiscuous mode
Oct 07 22:08:37 compute-0 NetworkManager[51722]: <info>  [1759874917.0365] device (tap2dd0148d-97): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 07 22:08:37 compute-0 nova_compute[192716]: 2025-10-07 22:08:37.049 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:37 compute-0 ovn_controller[94904]: 2025-10-07T22:08:37Z|00187|binding|INFO|Releasing lport 2dd0148d-97f4-4aff-a43b-54bd6cd5a349 from this chassis (sb_readonly=0)
Oct 07 22:08:37 compute-0 ovn_controller[94904]: 2025-10-07T22:08:37Z|00188|binding|INFO|Setting lport 2dd0148d-97f4-4aff-a43b-54bd6cd5a349 down in Southbound
Oct 07 22:08:37 compute-0 ovn_controller[94904]: 2025-10-07T22:08:37Z|00189|binding|INFO|Removing iface tap2dd0148d-97 ovn-installed in OVS
Oct 07 22:08:37 compute-0 nova_compute[192716]: 2025-10-07 22:08:37.052 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:37 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:08:37.059 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:26:cd:64 10.100.0.11'], port_security=['fa:16:3e:26:cd:64 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '1c6e57e1-3513-4376-b92a-8cbe948d8ec3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7f17307e-ac72-4a6f-8a05-ba2eca705379', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'faa7f94deef04b67982eaf47a775c225', 'neutron:revision_number': '15', 'neutron:security_group_ids': '6ea0c626-bce8-4d7e-8c0d-f51033bcdaff', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1675f3b1-9c7c-4176-8c45-0239d0b298ba, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], logical_port=2dd0148d-97f4-4aff-a43b-54bd6cd5a349) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f071072e510>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:08:37 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:08:37.060 103791 INFO neutron.agent.ovn.metadata.agent [-] Port 2dd0148d-97f4-4aff-a43b-54bd6cd5a349 in datapath 7f17307e-ac72-4a6f-8a05-ba2eca705379 unbound from our chassis
Oct 07 22:08:37 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:08:37.061 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7f17307e-ac72-4a6f-8a05-ba2eca705379, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 07 22:08:37 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:08:37.061 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[4a669154-148e-4c1e-be29-642702b238da]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:08:37 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:08:37.062 103791 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379 namespace which is not needed anymore
Oct 07 22:08:37 compute-0 nova_compute[192716]: 2025-10-07 22:08:37.085 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:37 compute-0 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000012.scope: Deactivated successfully.
Oct 07 22:08:37 compute-0 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000012.scope: Consumed 2.647s CPU time.
Oct 07 22:08:37 compute-0 systemd-machined[152719]: Machine qemu-15-instance-00000012 terminated.
Oct 07 22:08:37 compute-0 nova_compute[192716]: 2025-10-07 22:08:37.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:37 compute-0 nova_compute[192716]: 2025-10-07 22:08:37.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:37 compute-0 podman[222997]: 2025-10-07 22:08:37.268356456 +0000 UTC m=+0.057198981 container kill b6d8ac6b221794506605a1f5c2c4716f523c544d00f809b0d165964803206769 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Oct 07 22:08:37 compute-0 neutron-haproxy-ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379[222592]: [NOTICE]   (222596) : haproxy version is 3.0.5-8e879a5
Oct 07 22:08:37 compute-0 neutron-haproxy-ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379[222592]: [NOTICE]   (222596) : path to executable is /usr/sbin/haproxy
Oct 07 22:08:37 compute-0 neutron-haproxy-ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379[222592]: [WARNING]  (222596) : Exiting Master process...
Oct 07 22:08:37 compute-0 nova_compute[192716]: 2025-10-07 22:08:37.271 2 DEBUG nova.compute.manager [req-b885b758-191a-40fd-b425-41fa1809c7cd req-038b02f1-0404-45b2-a26b-ebec09c7dfbb 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 1c6e57e1-3513-4376-b92a-8cbe948d8ec3] Received event network-vif-unplugged-2dd0148d-97f4-4aff-a43b-54bd6cd5a349 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:08:37 compute-0 nova_compute[192716]: 2025-10-07 22:08:37.272 2 DEBUG oslo_concurrency.lockutils [req-b885b758-191a-40fd-b425-41fa1809c7cd req-038b02f1-0404-45b2-a26b-ebec09c7dfbb 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "1c6e57e1-3513-4376-b92a-8cbe948d8ec3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:08:37 compute-0 nova_compute[192716]: 2025-10-07 22:08:37.272 2 DEBUG oslo_concurrency.lockutils [req-b885b758-191a-40fd-b425-41fa1809c7cd req-038b02f1-0404-45b2-a26b-ebec09c7dfbb 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "1c6e57e1-3513-4376-b92a-8cbe948d8ec3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:08:37 compute-0 nova_compute[192716]: 2025-10-07 22:08:37.272 2 DEBUG oslo_concurrency.lockutils [req-b885b758-191a-40fd-b425-41fa1809c7cd req-038b02f1-0404-45b2-a26b-ebec09c7dfbb 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "1c6e57e1-3513-4376-b92a-8cbe948d8ec3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:08:37 compute-0 nova_compute[192716]: 2025-10-07 22:08:37.272 2 DEBUG nova.compute.manager [req-b885b758-191a-40fd-b425-41fa1809c7cd req-038b02f1-0404-45b2-a26b-ebec09c7dfbb 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 1c6e57e1-3513-4376-b92a-8cbe948d8ec3] No waiting events found dispatching network-vif-unplugged-2dd0148d-97f4-4aff-a43b-54bd6cd5a349 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 22:08:37 compute-0 nova_compute[192716]: 2025-10-07 22:08:37.273 2 DEBUG nova.compute.manager [req-b885b758-191a-40fd-b425-41fa1809c7cd req-038b02f1-0404-45b2-a26b-ebec09c7dfbb 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 1c6e57e1-3513-4376-b92a-8cbe948d8ec3] Received event network-vif-unplugged-2dd0148d-97f4-4aff-a43b-54bd6cd5a349 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 07 22:08:37 compute-0 neutron-haproxy-ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379[222592]: [ALERT]    (222596) : Current worker (222601) exited with code 143 (Terminated)
Oct 07 22:08:37 compute-0 neutron-haproxy-ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379[222592]: [WARNING]  (222596) : All workers exited. Exiting... (0)
Oct 07 22:08:37 compute-0 systemd[1]: libpod-b6d8ac6b221794506605a1f5c2c4716f523c544d00f809b0d165964803206769.scope: Deactivated successfully.
Oct 07 22:08:37 compute-0 nova_compute[192716]: 2025-10-07 22:08:37.283 2 INFO nova.virt.libvirt.driver [-] [instance: 1c6e57e1-3513-4376-b92a-8cbe948d8ec3] Instance destroyed successfully.
Oct 07 22:08:37 compute-0 nova_compute[192716]: 2025-10-07 22:08:37.284 2 DEBUG nova.objects.instance [None req-e1bb14d5-0eb9-4c7a-8896-5eea632c17cd 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Lazy-loading 'resources' on Instance uuid 1c6e57e1-3513-4376-b92a-8cbe948d8ec3 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 22:08:37 compute-0 podman[223026]: 2025-10-07 22:08:37.312138261 +0000 UTC m=+0.023066072 container died b6d8ac6b221794506605a1f5c2c4716f523c544d00f809b0d165964803206769 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, tcib_build_tag=watcher_latest)
Oct 07 22:08:37 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b6d8ac6b221794506605a1f5c2c4716f523c544d00f809b0d165964803206769-userdata-shm.mount: Deactivated successfully.
Oct 07 22:08:37 compute-0 systemd[1]: var-lib-containers-storage-overlay-0f27375bb6dd6ded02ee55156c1fe977c82094f9010d0447edca3b8c7567db1f-merged.mount: Deactivated successfully.
Oct 07 22:08:37 compute-0 podman[223026]: 2025-10-07 22:08:37.34699319 +0000 UTC m=+0.057920971 container cleanup b6d8ac6b221794506605a1f5c2c4716f523c544d00f809b0d165964803206769 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, maintainer=OpenStack Kubernetes Operator team)
Oct 07 22:08:37 compute-0 systemd[1]: libpod-conmon-b6d8ac6b221794506605a1f5c2c4716f523c544d00f809b0d165964803206769.scope: Deactivated successfully.
Oct 07 22:08:37 compute-0 podman[223028]: 2025-10-07 22:08:37.362531836 +0000 UTC m=+0.064071358 container remove b6d8ac6b221794506605a1f5c2c4716f523c544d00f809b0d165964803206769 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 07 22:08:37 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:08:37.380 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[1d74df39-e178-4146-9725-63e423bf7ca6]: (4, ("Tue Oct  7 10:08:37 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379 (b6d8ac6b221794506605a1f5c2c4716f523c544d00f809b0d165964803206769)\nb6d8ac6b221794506605a1f5c2c4716f523c544d00f809b0d165964803206769\nTue Oct  7 10:08:37 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379 (b6d8ac6b221794506605a1f5c2c4716f523c544d00f809b0d165964803206769)\nb6d8ac6b221794506605a1f5c2c4716f523c544d00f809b0d165964803206769\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:08:37 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:08:37.382 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[36dde509-a480-472a-bc1c-c5895e18bcf9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:08:37 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:08:37.382 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7f17307e-ac72-4a6f-8a05-ba2eca705379.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7f17307e-ac72-4a6f-8a05-ba2eca705379.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 22:08:37 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:08:37.383 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[b641c900-c7c8-484f-bc26-0c59322558f4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:08:37 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:08:37.384 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7f17307e-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:08:37 compute-0 nova_compute[192716]: 2025-10-07 22:08:37.386 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:37 compute-0 kernel: tap7f17307e-a0: left promiscuous mode
Oct 07 22:08:37 compute-0 nova_compute[192716]: 2025-10-07 22:08:37.404 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:37 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:08:37.405 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[50f8a2b1-3a1b-4a9b-a078-1ebb7ca490b9]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:08:37 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:08:37.440 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[6c118e05-384f-4e31-81cf-fc6a692569ec]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:08:37 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:08:37.441 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[317333bb-57ca-46e0-a40b-c94ee502fe2f]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:08:37 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:08:37.462 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[de2447b8-456a-4624-93d7-b8e7babed303]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 459194, 'reachable_time': 34308, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223058, 'error': None, 'target': 'ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:08:37 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:08:37.466 103905 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7f17307e-ac72-4a6f-8a05-ba2eca705379 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Oct 07 22:08:37 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:08:37.467 103905 DEBUG oslo.privsep.daemon [-] privsep: reply[beb7dfef-4ba8-40b1-b6f2-85af37572093]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:08:37 compute-0 systemd[1]: run-netns-ovnmeta\x2d7f17307e\x2dac72\x2d4a6f\x2d8a05\x2dba2eca705379.mount: Deactivated successfully.
Oct 07 22:08:37 compute-0 nova_compute[192716]: 2025-10-07 22:08:37.790 2 DEBUG nova.virt.libvirt.vif [None req-e1bb14d5-0eb9-4c7a-8896-5eea632c17cd 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-10-07T22:06:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-1648981262',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-164',id=18,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-07T22:07:14Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='faa7f94deef04b67982eaf47a775c225',ramdisk_id='',reservation_id='r-7x3varr4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',clean_attempts='1',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-2065444639',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-2065444639-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T22:08:21Z,user_data=None,user_id='641fbca23ed24b428028d3bc567991bf',uuid=1c6e57e1-3513-4376-b92a-8cbe948d8ec3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2dd0148d-97f4-4aff-a43b-54bd6cd5a349", "address": "fa:16:3e:26:cd:64", "network": {"id": "7f17307e-ac72-4a6f-8a05-ba2eca705379", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-301862773-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ddfd0140eaa4d5e8d43efda963767d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2dd0148d-97", "ovs_interfaceid": "2dd0148d-97f4-4aff-a43b-54bd6cd5a349", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 07 22:08:37 compute-0 nova_compute[192716]: 2025-10-07 22:08:37.791 2 DEBUG nova.network.os_vif_util [None req-e1bb14d5-0eb9-4c7a-8896-5eea632c17cd 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Converting VIF {"id": "2dd0148d-97f4-4aff-a43b-54bd6cd5a349", "address": "fa:16:3e:26:cd:64", "network": {"id": "7f17307e-ac72-4a6f-8a05-ba2eca705379", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-301862773-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ddfd0140eaa4d5e8d43efda963767d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2dd0148d-97", "ovs_interfaceid": "2dd0148d-97f4-4aff-a43b-54bd6cd5a349", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 22:08:37 compute-0 nova_compute[192716]: 2025-10-07 22:08:37.791 2 DEBUG nova.network.os_vif_util [None req-e1bb14d5-0eb9-4c7a-8896-5eea632c17cd 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:26:cd:64,bridge_name='br-int',has_traffic_filtering=True,id=2dd0148d-97f4-4aff-a43b-54bd6cd5a349,network=Network(7f17307e-ac72-4a6f-8a05-ba2eca705379),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2dd0148d-97') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 22:08:37 compute-0 nova_compute[192716]: 2025-10-07 22:08:37.792 2 DEBUG os_vif [None req-e1bb14d5-0eb9-4c7a-8896-5eea632c17cd 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:26:cd:64,bridge_name='br-int',has_traffic_filtering=True,id=2dd0148d-97f4-4aff-a43b-54bd6cd5a349,network=Network(7f17307e-ac72-4a6f-8a05-ba2eca705379),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2dd0148d-97') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 07 22:08:37 compute-0 nova_compute[192716]: 2025-10-07 22:08:37.796 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:37 compute-0 nova_compute[192716]: 2025-10-07 22:08:37.797 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2dd0148d-97, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:08:37 compute-0 nova_compute[192716]: 2025-10-07 22:08:37.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:37 compute-0 nova_compute[192716]: 2025-10-07 22:08:37.800 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:37 compute-0 nova_compute[192716]: 2025-10-07 22:08:37.801 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:37 compute-0 nova_compute[192716]: 2025-10-07 22:08:37.802 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=da1d3029-2faf-47aa-9311-1dd9c20c2450) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:08:37 compute-0 nova_compute[192716]: 2025-10-07 22:08:37.802 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:37 compute-0 nova_compute[192716]: 2025-10-07 22:08:37.804 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:37 compute-0 nova_compute[192716]: 2025-10-07 22:08:37.805 2 INFO os_vif [None req-e1bb14d5-0eb9-4c7a-8896-5eea632c17cd 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:26:cd:64,bridge_name='br-int',has_traffic_filtering=True,id=2dd0148d-97f4-4aff-a43b-54bd6cd5a349,network=Network(7f17307e-ac72-4a6f-8a05-ba2eca705379),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2dd0148d-97')
Oct 07 22:08:37 compute-0 nova_compute[192716]: 2025-10-07 22:08:37.806 2 INFO nova.virt.libvirt.driver [None req-e1bb14d5-0eb9-4c7a-8896-5eea632c17cd 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 1c6e57e1-3513-4376-b92a-8cbe948d8ec3] Deleting instance files /var/lib/nova/instances/1c6e57e1-3513-4376-b92a-8cbe948d8ec3_del
Oct 07 22:08:37 compute-0 nova_compute[192716]: 2025-10-07 22:08:37.807 2 INFO nova.virt.libvirt.driver [None req-e1bb14d5-0eb9-4c7a-8896-5eea632c17cd 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 1c6e57e1-3513-4376-b92a-8cbe948d8ec3] Deletion of /var/lib/nova/instances/1c6e57e1-3513-4376-b92a-8cbe948d8ec3_del complete
Oct 07 22:08:38 compute-0 nova_compute[192716]: 2025-10-07 22:08:38.320 2 INFO nova.compute.manager [None req-e1bb14d5-0eb9-4c7a-8896-5eea632c17cd 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] [instance: 1c6e57e1-3513-4376-b92a-8cbe948d8ec3] Took 1.31 seconds to destroy the instance on the hypervisor.
Oct 07 22:08:38 compute-0 nova_compute[192716]: 2025-10-07 22:08:38.320 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-e1bb14d5-0eb9-4c7a-8896-5eea632c17cd 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 07 22:08:38 compute-0 nova_compute[192716]: 2025-10-07 22:08:38.321 2 DEBUG nova.compute.manager [-] [instance: 1c6e57e1-3513-4376-b92a-8cbe948d8ec3] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 07 22:08:38 compute-0 nova_compute[192716]: 2025-10-07 22:08:38.322 2 DEBUG nova.network.neutron [-] [instance: 1c6e57e1-3513-4376-b92a-8cbe948d8ec3] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 07 22:08:38 compute-0 nova_compute[192716]: 2025-10-07 22:08:38.322 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:08:38 compute-0 nova_compute[192716]: 2025-10-07 22:08:38.750 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:08:38 compute-0 nova_compute[192716]: 2025-10-07 22:08:38.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:08:39 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:08:39.113 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:e1:cc', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '62:84:ba:a4:1b:31'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:08:39 compute-0 nova_compute[192716]: 2025-10-07 22:08:39.113 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:39 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:08:39.114 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 07 22:08:39 compute-0 nova_compute[192716]: 2025-10-07 22:08:39.343 2 DEBUG nova.compute.manager [req-da56827e-c7b5-4f21-8174-0177bd1f4f90 req-cd5ac617-24dd-4456-bb0c-2cdc8c66801f 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 1c6e57e1-3513-4376-b92a-8cbe948d8ec3] Received event network-vif-unplugged-2dd0148d-97f4-4aff-a43b-54bd6cd5a349 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:08:39 compute-0 nova_compute[192716]: 2025-10-07 22:08:39.344 2 DEBUG oslo_concurrency.lockutils [req-da56827e-c7b5-4f21-8174-0177bd1f4f90 req-cd5ac617-24dd-4456-bb0c-2cdc8c66801f 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "1c6e57e1-3513-4376-b92a-8cbe948d8ec3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:08:39 compute-0 nova_compute[192716]: 2025-10-07 22:08:39.344 2 DEBUG oslo_concurrency.lockutils [req-da56827e-c7b5-4f21-8174-0177bd1f4f90 req-cd5ac617-24dd-4456-bb0c-2cdc8c66801f 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "1c6e57e1-3513-4376-b92a-8cbe948d8ec3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:08:39 compute-0 nova_compute[192716]: 2025-10-07 22:08:39.345 2 DEBUG oslo_concurrency.lockutils [req-da56827e-c7b5-4f21-8174-0177bd1f4f90 req-cd5ac617-24dd-4456-bb0c-2cdc8c66801f 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "1c6e57e1-3513-4376-b92a-8cbe948d8ec3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:08:39 compute-0 nova_compute[192716]: 2025-10-07 22:08:39.345 2 DEBUG nova.compute.manager [req-da56827e-c7b5-4f21-8174-0177bd1f4f90 req-cd5ac617-24dd-4456-bb0c-2cdc8c66801f 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 1c6e57e1-3513-4376-b92a-8cbe948d8ec3] No waiting events found dispatching network-vif-unplugged-2dd0148d-97f4-4aff-a43b-54bd6cd5a349 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 22:08:39 compute-0 nova_compute[192716]: 2025-10-07 22:08:39.345 2 DEBUG nova.compute.manager [req-da56827e-c7b5-4f21-8174-0177bd1f4f90 req-cd5ac617-24dd-4456-bb0c-2cdc8c66801f 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 1c6e57e1-3513-4376-b92a-8cbe948d8ec3] Received event network-vif-unplugged-2dd0148d-97f4-4aff-a43b-54bd6cd5a349 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 07 22:08:39 compute-0 nova_compute[192716]: 2025-10-07 22:08:39.346 2 DEBUG nova.compute.manager [req-da56827e-c7b5-4f21-8174-0177bd1f4f90 req-cd5ac617-24dd-4456-bb0c-2cdc8c66801f 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 1c6e57e1-3513-4376-b92a-8cbe948d8ec3] Received event network-vif-deleted-2dd0148d-97f4-4aff-a43b-54bd6cd5a349 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:08:39 compute-0 nova_compute[192716]: 2025-10-07 22:08:39.346 2 INFO nova.compute.manager [req-da56827e-c7b5-4f21-8174-0177bd1f4f90 req-cd5ac617-24dd-4456-bb0c-2cdc8c66801f 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 1c6e57e1-3513-4376-b92a-8cbe948d8ec3] Neutron deleted interface 2dd0148d-97f4-4aff-a43b-54bd6cd5a349; detaching it from the instance and deleting it from the info cache
Oct 07 22:08:39 compute-0 nova_compute[192716]: 2025-10-07 22:08:39.347 2 DEBUG nova.network.neutron [req-da56827e-c7b5-4f21-8174-0177bd1f4f90 req-cd5ac617-24dd-4456-bb0c-2cdc8c66801f 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 1c6e57e1-3513-4376-b92a-8cbe948d8ec3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:08:39 compute-0 nova_compute[192716]: 2025-10-07 22:08:39.732 2 DEBUG nova.network.neutron [-] [instance: 1c6e57e1-3513-4376-b92a-8cbe948d8ec3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:08:39 compute-0 nova_compute[192716]: 2025-10-07 22:08:39.855 2 DEBUG nova.compute.manager [req-da56827e-c7b5-4f21-8174-0177bd1f4f90 req-cd5ac617-24dd-4456-bb0c-2cdc8c66801f 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 1c6e57e1-3513-4376-b92a-8cbe948d8ec3] Detach interface failed, port_id=2dd0148d-97f4-4aff-a43b-54bd6cd5a349, reason: Instance 1c6e57e1-3513-4376-b92a-8cbe948d8ec3 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 07 22:08:39 compute-0 podman[223060]: 2025-10-07 22:08:39.857792306 +0000 UTC m=+0.086695547 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 07 22:08:39 compute-0 nova_compute[192716]: 2025-10-07 22:08:39.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:08:40 compute-0 nova_compute[192716]: 2025-10-07 22:08:40.190 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:40 compute-0 nova_compute[192716]: 2025-10-07 22:08:40.239 2 INFO nova.compute.manager [-] [instance: 1c6e57e1-3513-4376-b92a-8cbe948d8ec3] Took 1.92 seconds to deallocate network for instance.
Oct 07 22:08:40 compute-0 nova_compute[192716]: 2025-10-07 22:08:40.505 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:08:40 compute-0 nova_compute[192716]: 2025-10-07 22:08:40.506 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:08:40 compute-0 nova_compute[192716]: 2025-10-07 22:08:40.506 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:08:40 compute-0 nova_compute[192716]: 2025-10-07 22:08:40.506 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 07 22:08:40 compute-0 nova_compute[192716]: 2025-10-07 22:08:40.724 2 WARNING nova.virt.libvirt.driver [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 22:08:40 compute-0 nova_compute[192716]: 2025-10-07 22:08:40.726 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:08:40 compute-0 nova_compute[192716]: 2025-10-07 22:08:40.750 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.024s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:08:40 compute-0 nova_compute[192716]: 2025-10-07 22:08:40.751 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5808MB free_disk=73.3036003112793GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 07 22:08:40 compute-0 nova_compute[192716]: 2025-10-07 22:08:40.752 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:08:40 compute-0 nova_compute[192716]: 2025-10-07 22:08:40.753 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:08:40 compute-0 nova_compute[192716]: 2025-10-07 22:08:40.765 2 DEBUG oslo_concurrency.lockutils [None req-e1bb14d5-0eb9-4c7a-8896-5eea632c17cd 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:08:41 compute-0 nova_compute[192716]: 2025-10-07 22:08:41.801 2 WARNING nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Instance 1c6e57e1-3513-4376-b92a-8cbe948d8ec3 is not being actively managed by this compute host but has allocations referencing this compute host: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. Skipping heal of allocation because we do not know what to do.
Oct 07 22:08:41 compute-0 nova_compute[192716]: 2025-10-07 22:08:41.802 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 07 22:08:41 compute-0 nova_compute[192716]: 2025-10-07 22:08:41.802 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:08:40 up  1:17,  0 user,  load average: 0.36, 0.23, 0.27\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 07 22:08:41 compute-0 nova_compute[192716]: 2025-10-07 22:08:41.836 2 DEBUG nova.compute.provider_tree [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 22:08:42 compute-0 nova_compute[192716]: 2025-10-07 22:08:42.343 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 22:08:42 compute-0 nova_compute[192716]: 2025-10-07 22:08:42.804 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:42 compute-0 nova_compute[192716]: 2025-10-07 22:08:42.852 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 07 22:08:42 compute-0 nova_compute[192716]: 2025-10-07 22:08:42.853 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.100s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:08:42 compute-0 nova_compute[192716]: 2025-10-07 22:08:42.853 2 DEBUG oslo_concurrency.lockutils [None req-e1bb14d5-0eb9-4c7a-8896-5eea632c17cd 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 2.088s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:08:42 compute-0 nova_compute[192716]: 2025-10-07 22:08:42.859 2 DEBUG oslo_concurrency.lockutils [None req-e1bb14d5-0eb9-4c7a-8896-5eea632c17cd 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.006s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:08:42 compute-0 nova_compute[192716]: 2025-10-07 22:08:42.896 2 INFO nova.scheduler.client.report [None req-e1bb14d5-0eb9-4c7a-8896-5eea632c17cd 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Deleted allocations for instance 1c6e57e1-3513-4376-b92a-8cbe948d8ec3
Oct 07 22:08:43 compute-0 nova_compute[192716]: 2025-10-07 22:08:43.929 2 DEBUG oslo_concurrency.lockutils [None req-e1bb14d5-0eb9-4c7a-8896-5eea632c17cd 641fbca23ed24b428028d3bc567991bf faa7f94deef04b67982eaf47a775c225 - - default default] Lock "1c6e57e1-3513-4376-b92a-8cbe948d8ec3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.455s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:08:45 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:08:45.118 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=dca786dc-b408-4181-8e47-0e14c60f13da, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:08:45 compute-0 nova_compute[192716]: 2025-10-07 22:08:45.192 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:45 compute-0 nova_compute[192716]: 2025-10-07 22:08:45.855 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:08:45 compute-0 nova_compute[192716]: 2025-10-07 22:08:45.855 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:08:45 compute-0 nova_compute[192716]: 2025-10-07 22:08:45.856 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:08:47 compute-0 nova_compute[192716]: 2025-10-07 22:08:47.091 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:47 compute-0 nova_compute[192716]: 2025-10-07 22:08:47.857 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:47 compute-0 podman[223088]: 2025-10-07 22:08:47.926293276 +0000 UTC m=+0.160059020 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Oct 07 22:08:49 compute-0 podman[223114]: 2025-10-07 22:08:49.832164228 +0000 UTC m=+0.059779655 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 07 22:08:50 compute-0 nova_compute[192716]: 2025-10-07 22:08:50.195 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:52 compute-0 nova_compute[192716]: 2025-10-07 22:08:52.860 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:53 compute-0 podman[223134]: 2025-10-07 22:08:53.849017024 +0000 UTC m=+0.085564715 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, architecture=x86_64, config_id=edpm, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Oct 07 22:08:55 compute-0 nova_compute[192716]: 2025-10-07 22:08:55.250 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:57 compute-0 nova_compute[192716]: 2025-10-07 22:08:57.895 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:08:59 compute-0 podman[203153]: time="2025-10-07T22:08:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:08:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:08:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 22:08:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:08:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3009 "" "Go-http-client/1.1"
Oct 07 22:09:00 compute-0 nova_compute[192716]: 2025-10-07 22:09:00.253 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:09:00 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:09:00.716 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:01:b9:b4 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-ea3a75de-7deb-4587-bd4b-e492c51c608d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ea3a75de-7deb-4587-bd4b-e492c51c608d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6006393ea657476389ab742b0f55b598', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cc4b6ec6-4880-401c-a20b-f966847f0277, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=774ed54e-b9ff-4315-b52d-fe6136fddde8) old=Port_Binding(mac=['fa:16:3e:01:b9:b4'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-ea3a75de-7deb-4587-bd4b-e492c51c608d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ea3a75de-7deb-4587-bd4b-e492c51c608d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6006393ea657476389ab742b0f55b598', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:09:00 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:09:00.717 103791 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 774ed54e-b9ff-4315-b52d-fe6136fddde8 in datapath ea3a75de-7deb-4587-bd4b-e492c51c608d updated
Oct 07 22:09:00 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:09:00.718 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ea3a75de-7deb-4587-bd4b-e492c51c608d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 07 22:09:00 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:09:00.719 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[640b781e-3ec5-48e3-bad6-0e61b9bf6ebb]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:09:01 compute-0 openstack_network_exporter[205305]: ERROR   22:09:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:09:01 compute-0 openstack_network_exporter[205305]: ERROR   22:09:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:09:01 compute-0 openstack_network_exporter[205305]: ERROR   22:09:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:09:01 compute-0 openstack_network_exporter[205305]: ERROR   22:09:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:09:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:09:01 compute-0 openstack_network_exporter[205305]: ERROR   22:09:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:09:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:09:02 compute-0 nova_compute[192716]: 2025-10-07 22:09:02.898 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:09:05 compute-0 nova_compute[192716]: 2025-10-07 22:09:05.254 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:09:07 compute-0 podman[223156]: 2025-10-07 22:09:07.829677977 +0000 UTC m=+0.074081005 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct 07 22:09:07 compute-0 podman[223157]: 2025-10-07 22:09:07.86498832 +0000 UTC m=+0.097612230 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4)
Oct 07 22:09:07 compute-0 nova_compute[192716]: 2025-10-07 22:09:07.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:09:08 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:09:08.406 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:62:e7:c5 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-840ea698-9029-48f3-9b51-8367123dbb90', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-840ea698-9029-48f3-9b51-8367123dbb90', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '571b320e0e5e447fa64ebcac1ce7ec0d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cb8e492d-55b8-4feb-a9b4-d3c585cd9665, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=d0e7cb92-f417-4278-8768-cb5b12f81cd7) old=Port_Binding(mac=['fa:16:3e:62:e7:c5'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-840ea698-9029-48f3-9b51-8367123dbb90', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-840ea698-9029-48f3-9b51-8367123dbb90', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '571b320e0e5e447fa64ebcac1ce7ec0d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:09:08 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:09:08.408 103791 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port d0e7cb92-f417-4278-8768-cb5b12f81cd7 in datapath 840ea698-9029-48f3-9b51-8367123dbb90 updated
Oct 07 22:09:08 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:09:08.409 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 840ea698-9029-48f3-9b51-8367123dbb90, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 07 22:09:08 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:09:08.410 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[a61a4fad-defe-46da-8943-a7669556c7a2]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:09:10 compute-0 nova_compute[192716]: 2025-10-07 22:09:10.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:09:10 compute-0 podman[223196]: 2025-10-07 22:09:10.844067901 +0000 UTC m=+0.075360181 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 07 22:09:12 compute-0 nova_compute[192716]: 2025-10-07 22:09:12.903 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:09:15 compute-0 nova_compute[192716]: 2025-10-07 22:09:15.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:09:17 compute-0 nova_compute[192716]: 2025-10-07 22:09:17.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:09:18 compute-0 podman[223222]: 2025-10-07 22:09:18.889938852 +0000 UTC m=+0.130642227 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Oct 07 22:09:20 compute-0 nova_compute[192716]: 2025-10-07 22:09:20.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:09:20 compute-0 ovn_controller[94904]: 2025-10-07T22:09:20Z|00190|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Oct 07 22:09:20 compute-0 podman[223250]: 2025-10-07 22:09:20.825169427 +0000 UTC m=+0.050067497 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true)
Oct 07 22:09:22 compute-0 nova_compute[192716]: 2025-10-07 22:09:22.949 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:09:24 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:09:24.678 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:e1:cc', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '62:84:ba:a4:1b:31'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:09:24 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:09:24.679 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 07 22:09:24 compute-0 nova_compute[192716]: 2025-10-07 22:09:24.684 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:09:24 compute-0 podman[223272]: 2025-10-07 22:09:24.832006033 +0000 UTC m=+0.070806750 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, vcs-type=git, config_id=edpm, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, container_name=openstack_network_exporter, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, release=1755695350, version=9.6, name=ubi9-minimal, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public)
Oct 07 22:09:25 compute-0 nova_compute[192716]: 2025-10-07 22:09:25.263 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:09:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:09:25.642 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:09:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:09:25.643 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:09:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:09:25.643 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:09:27 compute-0 nova_compute[192716]: 2025-10-07 22:09:27.951 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:09:29 compute-0 podman[203153]: time="2025-10-07T22:09:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:09:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:09:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 22:09:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:09:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3019 "" "Go-http-client/1.1"
Oct 07 22:09:30 compute-0 nova_compute[192716]: 2025-10-07 22:09:30.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:09:30 compute-0 nova_compute[192716]: 2025-10-07 22:09:30.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:09:31 compute-0 openstack_network_exporter[205305]: ERROR   22:09:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:09:31 compute-0 openstack_network_exporter[205305]: ERROR   22:09:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:09:31 compute-0 openstack_network_exporter[205305]: ERROR   22:09:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:09:31 compute-0 openstack_network_exporter[205305]: ERROR   22:09:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:09:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:09:31 compute-0 openstack_network_exporter[205305]: ERROR   22:09:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:09:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:09:32 compute-0 nova_compute[192716]: 2025-10-07 22:09:32.953 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:09:32 compute-0 nova_compute[192716]: 2025-10-07 22:09:32.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:09:32 compute-0 nova_compute[192716]: 2025-10-07 22:09:32.991 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 07 22:09:33 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:09:33.681 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=dca786dc-b408-4181-8e47-0e14c60f13da, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:09:35 compute-0 nova_compute[192716]: 2025-10-07 22:09:35.267 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:09:36 compute-0 nova_compute[192716]: 2025-10-07 22:09:36.986 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:09:37 compute-0 nova_compute[192716]: 2025-10-07 22:09:37.956 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:09:38 compute-0 nova_compute[192716]: 2025-10-07 22:09:38.452 2 DEBUG oslo_concurrency.lockutils [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] Acquiring lock "325ae6e1-77ba-444e-92cf-79c32803f073" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:09:38 compute-0 nova_compute[192716]: 2025-10-07 22:09:38.453 2 DEBUG oslo_concurrency.lockutils [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] Lock "325ae6e1-77ba-444e-92cf-79c32803f073" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:09:38 compute-0 podman[223294]: 2025-10-07 22:09:38.826971419 +0000 UTC m=+0.071647865 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, container_name=iscsid, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct 07 22:09:38 compute-0 podman[223295]: 2025-10-07 22:09:38.856399463 +0000 UTC m=+0.089811756 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Oct 07 22:09:38 compute-0 nova_compute[192716]: 2025-10-07 22:09:38.959 2 DEBUG nova.compute.manager [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Oct 07 22:09:39 compute-0 nova_compute[192716]: 2025-10-07 22:09:39.516 2 DEBUG oslo_concurrency.lockutils [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:09:39 compute-0 nova_compute[192716]: 2025-10-07 22:09:39.517 2 DEBUG oslo_concurrency.lockutils [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:09:39 compute-0 nova_compute[192716]: 2025-10-07 22:09:39.525 2 DEBUG nova.virt.hardware [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Oct 07 22:09:39 compute-0 nova_compute[192716]: 2025-10-07 22:09:39.526 2 INFO nova.compute.claims [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Claim successful on node compute-0.ctlplane.example.com
Oct 07 22:09:40 compute-0 nova_compute[192716]: 2025-10-07 22:09:40.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:09:40 compute-0 nova_compute[192716]: 2025-10-07 22:09:40.598 2 DEBUG nova.compute.provider_tree [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 22:09:40 compute-0 nova_compute[192716]: 2025-10-07 22:09:40.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:09:41 compute-0 nova_compute[192716]: 2025-10-07 22:09:41.105 2 DEBUG nova.scheduler.client.report [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 22:09:41 compute-0 nova_compute[192716]: 2025-10-07 22:09:41.615 2 DEBUG oslo_concurrency.lockutils [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.098s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:09:41 compute-0 nova_compute[192716]: 2025-10-07 22:09:41.616 2 DEBUG nova.compute.manager [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Oct 07 22:09:41 compute-0 podman[223329]: 2025-10-07 22:09:41.874265987 +0000 UTC m=+0.103635932 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 07 22:09:41 compute-0 nova_compute[192716]: 2025-10-07 22:09:41.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:09:42 compute-0 nova_compute[192716]: 2025-10-07 22:09:42.130 2 DEBUG nova.compute.manager [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Oct 07 22:09:42 compute-0 nova_compute[192716]: 2025-10-07 22:09:42.130 2 DEBUG nova.network.neutron [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Oct 07 22:09:42 compute-0 nova_compute[192716]: 2025-10-07 22:09:42.131 2 WARNING neutronclient.v2_0.client [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:09:42 compute-0 nova_compute[192716]: 2025-10-07 22:09:42.131 2 WARNING neutronclient.v2_0.client [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:09:42 compute-0 nova_compute[192716]: 2025-10-07 22:09:42.501 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:09:42 compute-0 nova_compute[192716]: 2025-10-07 22:09:42.501 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:09:42 compute-0 nova_compute[192716]: 2025-10-07 22:09:42.502 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:09:42 compute-0 nova_compute[192716]: 2025-10-07 22:09:42.502 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 07 22:09:42 compute-0 nova_compute[192716]: 2025-10-07 22:09:42.639 2 INFO nova.virt.libvirt.driver [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 07 22:09:42 compute-0 nova_compute[192716]: 2025-10-07 22:09:42.705 2 WARNING nova.virt.libvirt.driver [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 22:09:42 compute-0 nova_compute[192716]: 2025-10-07 22:09:42.707 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:09:42 compute-0 nova_compute[192716]: 2025-10-07 22:09:42.740 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.033s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:09:42 compute-0 nova_compute[192716]: 2025-10-07 22:09:42.741 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5810MB free_disk=73.3036003112793GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 07 22:09:42 compute-0 nova_compute[192716]: 2025-10-07 22:09:42.741 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:09:42 compute-0 nova_compute[192716]: 2025-10-07 22:09:42.742 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:09:42 compute-0 nova_compute[192716]: 2025-10-07 22:09:42.958 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:09:43 compute-0 nova_compute[192716]: 2025-10-07 22:09:43.150 2 DEBUG nova.compute.manager [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Oct 07 22:09:43 compute-0 nova_compute[192716]: 2025-10-07 22:09:43.789 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Instance 325ae6e1-77ba-444e-92cf-79c32803f073 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 07 22:09:43 compute-0 nova_compute[192716]: 2025-10-07 22:09:43.789 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 07 22:09:43 compute-0 nova_compute[192716]: 2025-10-07 22:09:43.789 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:09:42 up  1:18,  0 user,  load average: 0.12, 0.18, 0.25\n', 'num_instances': '1', 'num_vm_building': '1', 'num_task_networking': '1', 'num_os_type_None': '1', 'num_proj_571b320e0e5e447fa64ebcac1ce7ec0d': '1', 'io_workload': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 07 22:09:43 compute-0 nova_compute[192716]: 2025-10-07 22:09:43.834 2 DEBUG nova.compute.provider_tree [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 22:09:44 compute-0 nova_compute[192716]: 2025-10-07 22:09:44.172 2 DEBUG nova.compute.manager [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Oct 07 22:09:44 compute-0 nova_compute[192716]: 2025-10-07 22:09:44.173 2 DEBUG nova.virt.libvirt.driver [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Oct 07 22:09:44 compute-0 nova_compute[192716]: 2025-10-07 22:09:44.174 2 INFO nova.virt.libvirt.driver [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Creating image(s)
Oct 07 22:09:44 compute-0 nova_compute[192716]: 2025-10-07 22:09:44.174 2 DEBUG oslo_concurrency.lockutils [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] Acquiring lock "/var/lib/nova/instances/325ae6e1-77ba-444e-92cf-79c32803f073/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:09:44 compute-0 nova_compute[192716]: 2025-10-07 22:09:44.175 2 DEBUG oslo_concurrency.lockutils [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] Lock "/var/lib/nova/instances/325ae6e1-77ba-444e-92cf-79c32803f073/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:09:44 compute-0 nova_compute[192716]: 2025-10-07 22:09:44.175 2 DEBUG oslo_concurrency.lockutils [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] Lock "/var/lib/nova/instances/325ae6e1-77ba-444e-92cf-79c32803f073/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:09:44 compute-0 nova_compute[192716]: 2025-10-07 22:09:44.176 2 DEBUG oslo_utils.imageutils.format_inspector [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:09:44 compute-0 nova_compute[192716]: 2025-10-07 22:09:44.179 2 DEBUG oslo_utils.imageutils.format_inspector [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:09:44 compute-0 nova_compute[192716]: 2025-10-07 22:09:44.180 2 DEBUG oslo_concurrency.processutils [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:09:44 compute-0 nova_compute[192716]: 2025-10-07 22:09:44.232 2 DEBUG oslo_concurrency.processutils [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:09:44 compute-0 nova_compute[192716]: 2025-10-07 22:09:44.233 2 DEBUG oslo_concurrency.lockutils [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] Acquiring lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:09:44 compute-0 nova_compute[192716]: 2025-10-07 22:09:44.233 2 DEBUG oslo_concurrency.lockutils [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] Lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:09:44 compute-0 nova_compute[192716]: 2025-10-07 22:09:44.234 2 DEBUG oslo_utils.imageutils.format_inspector [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:09:44 compute-0 nova_compute[192716]: 2025-10-07 22:09:44.236 2 DEBUG oslo_utils.imageutils.format_inspector [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:09:44 compute-0 nova_compute[192716]: 2025-10-07 22:09:44.237 2 DEBUG oslo_concurrency.processutils [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:09:44 compute-0 nova_compute[192716]: 2025-10-07 22:09:44.297 2 DEBUG oslo_concurrency.processutils [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:09:44 compute-0 nova_compute[192716]: 2025-10-07 22:09:44.298 2 DEBUG oslo_concurrency.processutils [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71,backing_fmt=raw /var/lib/nova/instances/325ae6e1-77ba-444e-92cf-79c32803f073/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:09:44 compute-0 nova_compute[192716]: 2025-10-07 22:09:44.340 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 22:09:44 compute-0 nova_compute[192716]: 2025-10-07 22:09:44.345 2 DEBUG oslo_concurrency.processutils [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71,backing_fmt=raw /var/lib/nova/instances/325ae6e1-77ba-444e-92cf-79c32803f073/disk 1073741824" returned: 0 in 0.048s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:09:44 compute-0 nova_compute[192716]: 2025-10-07 22:09:44.346 2 DEBUG oslo_concurrency.lockutils [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] Lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.113s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:09:44 compute-0 nova_compute[192716]: 2025-10-07 22:09:44.346 2 DEBUG oslo_concurrency.processutils [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:09:44 compute-0 nova_compute[192716]: 2025-10-07 22:09:44.412 2 DEBUG oslo_concurrency.processutils [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:09:44 compute-0 nova_compute[192716]: 2025-10-07 22:09:44.413 2 DEBUG nova.virt.disk.api [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] Checking if we can resize image /var/lib/nova/instances/325ae6e1-77ba-444e-92cf-79c32803f073/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 07 22:09:44 compute-0 nova_compute[192716]: 2025-10-07 22:09:44.413 2 DEBUG oslo_concurrency.processutils [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/325ae6e1-77ba-444e-92cf-79c32803f073/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:09:44 compute-0 nova_compute[192716]: 2025-10-07 22:09:44.465 2 DEBUG oslo_concurrency.processutils [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/325ae6e1-77ba-444e-92cf-79c32803f073/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:09:44 compute-0 nova_compute[192716]: 2025-10-07 22:09:44.466 2 DEBUG nova.virt.disk.api [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] Cannot resize image /var/lib/nova/instances/325ae6e1-77ba-444e-92cf-79c32803f073/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 07 22:09:44 compute-0 nova_compute[192716]: 2025-10-07 22:09:44.467 2 DEBUG nova.virt.libvirt.driver [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Oct 07 22:09:44 compute-0 nova_compute[192716]: 2025-10-07 22:09:44.467 2 DEBUG nova.virt.libvirt.driver [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Ensure instance console log exists: /var/lib/nova/instances/325ae6e1-77ba-444e-92cf-79c32803f073/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Oct 07 22:09:44 compute-0 nova_compute[192716]: 2025-10-07 22:09:44.467 2 DEBUG oslo_concurrency.lockutils [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:09:44 compute-0 nova_compute[192716]: 2025-10-07 22:09:44.468 2 DEBUG oslo_concurrency.lockutils [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:09:44 compute-0 nova_compute[192716]: 2025-10-07 22:09:44.468 2 DEBUG oslo_concurrency.lockutils [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:09:44 compute-0 nova_compute[192716]: 2025-10-07 22:09:44.592 2 DEBUG nova.network.neutron [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Successfully created port: 9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Oct 07 22:09:44 compute-0 nova_compute[192716]: 2025-10-07 22:09:44.856 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 07 22:09:44 compute-0 nova_compute[192716]: 2025-10-07 22:09:44.857 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.115s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:09:45 compute-0 nova_compute[192716]: 2025-10-07 22:09:45.182 2 DEBUG nova.network.neutron [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Successfully updated port: 9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Oct 07 22:09:45 compute-0 nova_compute[192716]: 2025-10-07 22:09:45.258 2 DEBUG nova.compute.manager [req-57056269-273c-4d06-8a1f-77f59d238026 req-420c094a-eb7a-44d6-9536-6e920b959187 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Received event network-changed-9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:09:45 compute-0 nova_compute[192716]: 2025-10-07 22:09:45.259 2 DEBUG nova.compute.manager [req-57056269-273c-4d06-8a1f-77f59d238026 req-420c094a-eb7a-44d6-9536-6e920b959187 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Refreshing instance network info cache due to event network-changed-9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Oct 07 22:09:45 compute-0 nova_compute[192716]: 2025-10-07 22:09:45.259 2 DEBUG oslo_concurrency.lockutils [req-57056269-273c-4d06-8a1f-77f59d238026 req-420c094a-eb7a-44d6-9536-6e920b959187 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "refresh_cache-325ae6e1-77ba-444e-92cf-79c32803f073" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 07 22:09:45 compute-0 nova_compute[192716]: 2025-10-07 22:09:45.259 2 DEBUG oslo_concurrency.lockutils [req-57056269-273c-4d06-8a1f-77f59d238026 req-420c094a-eb7a-44d6-9536-6e920b959187 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquired lock "refresh_cache-325ae6e1-77ba-444e-92cf-79c32803f073" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 07 22:09:45 compute-0 nova_compute[192716]: 2025-10-07 22:09:45.259 2 DEBUG nova.network.neutron [req-57056269-273c-4d06-8a1f-77f59d238026 req-420c094a-eb7a-44d6-9536-6e920b959187 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Refreshing network info cache for port 9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Oct 07 22:09:45 compute-0 nova_compute[192716]: 2025-10-07 22:09:45.271 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:09:45 compute-0 nova_compute[192716]: 2025-10-07 22:09:45.688 2 DEBUG oslo_concurrency.lockutils [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] Acquiring lock "refresh_cache-325ae6e1-77ba-444e-92cf-79c32803f073" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 07 22:09:45 compute-0 nova_compute[192716]: 2025-10-07 22:09:45.766 2 WARNING neutronclient.v2_0.client [req-57056269-273c-4d06-8a1f-77f59d238026 req-420c094a-eb7a-44d6-9536-6e920b959187 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:09:45 compute-0 nova_compute[192716]: 2025-10-07 22:09:45.852 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:09:46 compute-0 nova_compute[192716]: 2025-10-07 22:09:46.271 2 DEBUG nova.network.neutron [req-57056269-273c-4d06-8a1f-77f59d238026 req-420c094a-eb7a-44d6-9536-6e920b959187 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 07 22:09:46 compute-0 nova_compute[192716]: 2025-10-07 22:09:46.361 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:09:46 compute-0 nova_compute[192716]: 2025-10-07 22:09:46.362 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:09:46 compute-0 nova_compute[192716]: 2025-10-07 22:09:46.420 2 DEBUG nova.network.neutron [req-57056269-273c-4d06-8a1f-77f59d238026 req-420c094a-eb7a-44d6-9536-6e920b959187 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:09:46 compute-0 nova_compute[192716]: 2025-10-07 22:09:46.927 2 DEBUG oslo_concurrency.lockutils [req-57056269-273c-4d06-8a1f-77f59d238026 req-420c094a-eb7a-44d6-9536-6e920b959187 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Releasing lock "refresh_cache-325ae6e1-77ba-444e-92cf-79c32803f073" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 07 22:09:46 compute-0 nova_compute[192716]: 2025-10-07 22:09:46.929 2 DEBUG oslo_concurrency.lockutils [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] Acquired lock "refresh_cache-325ae6e1-77ba-444e-92cf-79c32803f073" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 07 22:09:46 compute-0 nova_compute[192716]: 2025-10-07 22:09:46.929 2 DEBUG nova.network.neutron [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 07 22:09:46 compute-0 nova_compute[192716]: 2025-10-07 22:09:46.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:09:47 compute-0 nova_compute[192716]: 2025-10-07 22:09:47.918 2 DEBUG nova.network.neutron [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 07 22:09:47 compute-0 nova_compute[192716]: 2025-10-07 22:09:47.960 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:09:48 compute-0 nova_compute[192716]: 2025-10-07 22:09:48.154 2 WARNING neutronclient.v2_0.client [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:09:48 compute-0 nova_compute[192716]: 2025-10-07 22:09:48.302 2 DEBUG nova.network.neutron [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Updating instance_info_cache with network_info: [{"id": "9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8", "address": "fa:16:3e:e8:d6:ff", "network": {"id": "ea3a75de-7deb-4587-bd4b-e492c51c608d", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1832409876-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6006393ea657476389ab742b0f55b598", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f555d6e-b6", "ovs_interfaceid": "9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:09:48 compute-0 nova_compute[192716]: 2025-10-07 22:09:48.808 2 DEBUG oslo_concurrency.lockutils [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] Releasing lock "refresh_cache-325ae6e1-77ba-444e-92cf-79c32803f073" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 07 22:09:48 compute-0 nova_compute[192716]: 2025-10-07 22:09:48.809 2 DEBUG nova.compute.manager [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Instance network_info: |[{"id": "9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8", "address": "fa:16:3e:e8:d6:ff", "network": {"id": "ea3a75de-7deb-4587-bd4b-e492c51c608d", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1832409876-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6006393ea657476389ab742b0f55b598", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f555d6e-b6", "ovs_interfaceid": "9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Oct 07 22:09:48 compute-0 nova_compute[192716]: 2025-10-07 22:09:48.813 2 DEBUG nova.virt.libvirt.driver [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Start _get_guest_xml network_info=[{"id": "9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8", "address": "fa:16:3e:e8:d6:ff", "network": {"id": "ea3a75de-7deb-4587-bd4b-e492c51c608d", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1832409876-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6006393ea657476389ab742b0f55b598", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f555d6e-b6", "ovs_interfaceid": "9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T21:45:40Z,direct_url=<?>,disk_format='qcow2',id=c40cab67-7e52-4762-b275-de0efa24bdf4,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='293ff4341f3d48a4ae100bf4fc7b99bd',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T21:45:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'boot_index': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'image_id': 'c40cab67-7e52-4762-b275-de0efa24bdf4'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Oct 07 22:09:48 compute-0 nova_compute[192716]: 2025-10-07 22:09:48.819 2 WARNING nova.virt.libvirt.driver [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 22:09:48 compute-0 nova_compute[192716]: 2025-10-07 22:09:48.821 2 DEBUG nova.virt.driver [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='c40cab67-7e52-4762-b275-de0efa24bdf4', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-1186382154', uuid='325ae6e1-77ba-444e-92cf-79c32803f073'), owner=OwnerMeta(userid='65d5e89c36a04afaa9a8bf3d1033a4f5', username='tempest-TestExecuteVmWorkloadBalanceStrategy-664850930-project-admin', projectid='571b320e0e5e447fa64ebcac1ce7ec0d', projectname='tempest-TestExecuteVmWorkloadBalanceStrategy-664850930'), image=ImageMeta(id='c40cab67-7e52-4762-b275-de0efa24bdf4', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='e6ccb6f6-2ec4-4305-9fdd-4d931e83ed21', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8", "address": "fa:16:3e:e8:d6:ff", "network": {"id": "ea3a75de-7deb-4587-bd4b-e492c51c608d", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1832409876-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6006393ea657476389ab742b0f55b598", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f555d6e-b6", "ovs_interfaceid": "9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251007122402.7278e66.el10', creation_time=1759874988.821293) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Oct 07 22:09:48 compute-0 nova_compute[192716]: 2025-10-07 22:09:48.826 2 DEBUG nova.virt.libvirt.host [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Oct 07 22:09:48 compute-0 nova_compute[192716]: 2025-10-07 22:09:48.827 2 DEBUG nova.virt.libvirt.host [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Oct 07 22:09:48 compute-0 nova_compute[192716]: 2025-10-07 22:09:48.831 2 DEBUG nova.virt.libvirt.host [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Oct 07 22:09:48 compute-0 nova_compute[192716]: 2025-10-07 22:09:48.831 2 DEBUG nova.virt.libvirt.host [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Oct 07 22:09:48 compute-0 nova_compute[192716]: 2025-10-07 22:09:48.832 2 DEBUG nova.virt.libvirt.driver [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Oct 07 22:09:48 compute-0 nova_compute[192716]: 2025-10-07 22:09:48.832 2 DEBUG nova.virt.hardware [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-07T21:45:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='e6ccb6f6-2ec4-4305-9fdd-4d931e83ed21',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T21:45:40Z,direct_url=<?>,disk_format='qcow2',id=c40cab67-7e52-4762-b275-de0efa24bdf4,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='293ff4341f3d48a4ae100bf4fc7b99bd',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T21:45:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Oct 07 22:09:48 compute-0 nova_compute[192716]: 2025-10-07 22:09:48.833 2 DEBUG nova.virt.hardware [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Oct 07 22:09:48 compute-0 nova_compute[192716]: 2025-10-07 22:09:48.834 2 DEBUG nova.virt.hardware [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Oct 07 22:09:48 compute-0 nova_compute[192716]: 2025-10-07 22:09:48.834 2 DEBUG nova.virt.hardware [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Oct 07 22:09:48 compute-0 nova_compute[192716]: 2025-10-07 22:09:48.835 2 DEBUG nova.virt.hardware [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Oct 07 22:09:48 compute-0 nova_compute[192716]: 2025-10-07 22:09:48.835 2 DEBUG nova.virt.hardware [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Oct 07 22:09:48 compute-0 nova_compute[192716]: 2025-10-07 22:09:48.835 2 DEBUG nova.virt.hardware [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Oct 07 22:09:48 compute-0 nova_compute[192716]: 2025-10-07 22:09:48.836 2 DEBUG nova.virt.hardware [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Oct 07 22:09:48 compute-0 nova_compute[192716]: 2025-10-07 22:09:48.836 2 DEBUG nova.virt.hardware [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Oct 07 22:09:48 compute-0 nova_compute[192716]: 2025-10-07 22:09:48.837 2 DEBUG nova.virt.hardware [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Oct 07 22:09:48 compute-0 nova_compute[192716]: 2025-10-07 22:09:48.837 2 DEBUG nova.virt.hardware [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Oct 07 22:09:48 compute-0 nova_compute[192716]: 2025-10-07 22:09:48.844 2 DEBUG nova.virt.libvirt.vif [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-07T22:09:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-1186382154',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-1186382154',id=21,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='571b320e0e5e447fa64ebcac1ce7ec0d',ramdisk_id='',reservation_id='r-43pjucrb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-664850930',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-664850930-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T22:09:43Z,user_data=None,user_id='65d5e89c36a04afaa9a8bf3d1033a4f5',uuid=325ae6e1-77ba-444e-92cf-79c32803f073,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8", "address": "fa:16:3e:e8:d6:ff", "network": {"id": "ea3a75de-7deb-4587-bd4b-e492c51c608d", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1832409876-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6006393ea657476389ab742b0f55b598", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f555d6e-b6", "ovs_interfaceid": "9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Oct 07 22:09:48 compute-0 nova_compute[192716]: 2025-10-07 22:09:48.844 2 DEBUG nova.network.os_vif_util [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] Converting VIF {"id": "9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8", "address": "fa:16:3e:e8:d6:ff", "network": {"id": "ea3a75de-7deb-4587-bd4b-e492c51c608d", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1832409876-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6006393ea657476389ab742b0f55b598", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f555d6e-b6", "ovs_interfaceid": "9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 22:09:48 compute-0 nova_compute[192716]: 2025-10-07 22:09:48.846 2 DEBUG nova.network.os_vif_util [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e8:d6:ff,bridge_name='br-int',has_traffic_filtering=True,id=9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8,network=Network(ea3a75de-7deb-4587-bd4b-e492c51c608d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9f555d6e-b6') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 22:09:48 compute-0 nova_compute[192716]: 2025-10-07 22:09:48.847 2 DEBUG nova.objects.instance [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] Lazy-loading 'pci_devices' on Instance uuid 325ae6e1-77ba-444e-92cf-79c32803f073 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 22:09:49 compute-0 nova_compute[192716]: 2025-10-07 22:09:49.365 2 DEBUG nova.virt.libvirt.driver [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] End _get_guest_xml xml=<domain type="kvm">
Oct 07 22:09:49 compute-0 nova_compute[192716]:   <uuid>325ae6e1-77ba-444e-92cf-79c32803f073</uuid>
Oct 07 22:09:49 compute-0 nova_compute[192716]:   <name>instance-00000015</name>
Oct 07 22:09:49 compute-0 nova_compute[192716]:   <memory>131072</memory>
Oct 07 22:09:49 compute-0 nova_compute[192716]:   <vcpu>1</vcpu>
Oct 07 22:09:49 compute-0 nova_compute[192716]:   <metadata>
Oct 07 22:09:49 compute-0 nova_compute[192716]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 07 22:09:49 compute-0 nova_compute[192716]:       <nova:package version="32.1.0-0.20251007122402.7278e66.el10"/>
Oct 07 22:09:49 compute-0 nova_compute[192716]:       <nova:name>tempest-TestExecuteVmWorkloadBalanceStrategy-server-1186382154</nova:name>
Oct 07 22:09:49 compute-0 nova_compute[192716]:       <nova:creationTime>2025-10-07 22:09:48</nova:creationTime>
Oct 07 22:09:49 compute-0 nova_compute[192716]:       <nova:flavor name="m1.nano" id="e6ccb6f6-2ec4-4305-9fdd-4d931e83ed21">
Oct 07 22:09:49 compute-0 nova_compute[192716]:         <nova:memory>128</nova:memory>
Oct 07 22:09:49 compute-0 nova_compute[192716]:         <nova:disk>1</nova:disk>
Oct 07 22:09:49 compute-0 nova_compute[192716]:         <nova:swap>0</nova:swap>
Oct 07 22:09:49 compute-0 nova_compute[192716]:         <nova:ephemeral>0</nova:ephemeral>
Oct 07 22:09:49 compute-0 nova_compute[192716]:         <nova:vcpus>1</nova:vcpus>
Oct 07 22:09:49 compute-0 nova_compute[192716]:         <nova:extraSpecs>
Oct 07 22:09:49 compute-0 nova_compute[192716]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 07 22:09:49 compute-0 nova_compute[192716]:         </nova:extraSpecs>
Oct 07 22:09:49 compute-0 nova_compute[192716]:       </nova:flavor>
Oct 07 22:09:49 compute-0 nova_compute[192716]:       <nova:image uuid="c40cab67-7e52-4762-b275-de0efa24bdf4">
Oct 07 22:09:49 compute-0 nova_compute[192716]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 07 22:09:49 compute-0 nova_compute[192716]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 07 22:09:49 compute-0 nova_compute[192716]:         <nova:minDisk>1</nova:minDisk>
Oct 07 22:09:49 compute-0 nova_compute[192716]:         <nova:minRam>0</nova:minRam>
Oct 07 22:09:49 compute-0 nova_compute[192716]:         <nova:properties>
Oct 07 22:09:49 compute-0 nova_compute[192716]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 07 22:09:49 compute-0 nova_compute[192716]:         </nova:properties>
Oct 07 22:09:49 compute-0 nova_compute[192716]:       </nova:image>
Oct 07 22:09:49 compute-0 nova_compute[192716]:       <nova:owner>
Oct 07 22:09:49 compute-0 nova_compute[192716]:         <nova:user uuid="65d5e89c36a04afaa9a8bf3d1033a4f5">tempest-TestExecuteVmWorkloadBalanceStrategy-664850930-project-admin</nova:user>
Oct 07 22:09:49 compute-0 nova_compute[192716]:         <nova:project uuid="571b320e0e5e447fa64ebcac1ce7ec0d">tempest-TestExecuteVmWorkloadBalanceStrategy-664850930</nova:project>
Oct 07 22:09:49 compute-0 nova_compute[192716]:       </nova:owner>
Oct 07 22:09:49 compute-0 nova_compute[192716]:       <nova:root type="image" uuid="c40cab67-7e52-4762-b275-de0efa24bdf4"/>
Oct 07 22:09:49 compute-0 nova_compute[192716]:       <nova:ports>
Oct 07 22:09:49 compute-0 nova_compute[192716]:         <nova:port uuid="9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8">
Oct 07 22:09:49 compute-0 nova_compute[192716]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Oct 07 22:09:49 compute-0 nova_compute[192716]:         </nova:port>
Oct 07 22:09:49 compute-0 nova_compute[192716]:       </nova:ports>
Oct 07 22:09:49 compute-0 nova_compute[192716]:     </nova:instance>
Oct 07 22:09:49 compute-0 nova_compute[192716]:   </metadata>
Oct 07 22:09:49 compute-0 nova_compute[192716]:   <sysinfo type="smbios">
Oct 07 22:09:49 compute-0 nova_compute[192716]:     <system>
Oct 07 22:09:49 compute-0 nova_compute[192716]:       <entry name="manufacturer">RDO</entry>
Oct 07 22:09:49 compute-0 nova_compute[192716]:       <entry name="product">OpenStack Compute</entry>
Oct 07 22:09:49 compute-0 nova_compute[192716]:       <entry name="version">32.1.0-0.20251007122402.7278e66.el10</entry>
Oct 07 22:09:49 compute-0 nova_compute[192716]:       <entry name="serial">325ae6e1-77ba-444e-92cf-79c32803f073</entry>
Oct 07 22:09:49 compute-0 nova_compute[192716]:       <entry name="uuid">325ae6e1-77ba-444e-92cf-79c32803f073</entry>
Oct 07 22:09:49 compute-0 nova_compute[192716]:       <entry name="family">Virtual Machine</entry>
Oct 07 22:09:49 compute-0 nova_compute[192716]:     </system>
Oct 07 22:09:49 compute-0 nova_compute[192716]:   </sysinfo>
Oct 07 22:09:49 compute-0 nova_compute[192716]:   <os>
Oct 07 22:09:49 compute-0 nova_compute[192716]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 07 22:09:49 compute-0 nova_compute[192716]:     <boot dev="hd"/>
Oct 07 22:09:49 compute-0 nova_compute[192716]:     <smbios mode="sysinfo"/>
Oct 07 22:09:49 compute-0 nova_compute[192716]:   </os>
Oct 07 22:09:49 compute-0 nova_compute[192716]:   <features>
Oct 07 22:09:49 compute-0 nova_compute[192716]:     <acpi/>
Oct 07 22:09:49 compute-0 nova_compute[192716]:     <apic/>
Oct 07 22:09:49 compute-0 nova_compute[192716]:     <vmcoreinfo/>
Oct 07 22:09:49 compute-0 nova_compute[192716]:   </features>
Oct 07 22:09:49 compute-0 nova_compute[192716]:   <clock offset="utc">
Oct 07 22:09:49 compute-0 nova_compute[192716]:     <timer name="pit" tickpolicy="delay"/>
Oct 07 22:09:49 compute-0 nova_compute[192716]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 07 22:09:49 compute-0 nova_compute[192716]:     <timer name="hpet" present="no"/>
Oct 07 22:09:49 compute-0 nova_compute[192716]:   </clock>
Oct 07 22:09:49 compute-0 nova_compute[192716]:   <cpu mode="host-model" match="exact">
Oct 07 22:09:49 compute-0 nova_compute[192716]:     <topology sockets="1" cores="1" threads="1"/>
Oct 07 22:09:49 compute-0 nova_compute[192716]:   </cpu>
Oct 07 22:09:49 compute-0 nova_compute[192716]:   <devices>
Oct 07 22:09:49 compute-0 nova_compute[192716]:     <disk type="file" device="disk">
Oct 07 22:09:49 compute-0 nova_compute[192716]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 07 22:09:49 compute-0 nova_compute[192716]:       <source file="/var/lib/nova/instances/325ae6e1-77ba-444e-92cf-79c32803f073/disk"/>
Oct 07 22:09:49 compute-0 nova_compute[192716]:       <target dev="vda" bus="virtio"/>
Oct 07 22:09:49 compute-0 nova_compute[192716]:     </disk>
Oct 07 22:09:49 compute-0 nova_compute[192716]:     <disk type="file" device="cdrom">
Oct 07 22:09:49 compute-0 nova_compute[192716]:       <driver name="qemu" type="raw" cache="none"/>
Oct 07 22:09:49 compute-0 nova_compute[192716]:       <source file="/var/lib/nova/instances/325ae6e1-77ba-444e-92cf-79c32803f073/disk.config"/>
Oct 07 22:09:49 compute-0 nova_compute[192716]:       <target dev="sda" bus="sata"/>
Oct 07 22:09:49 compute-0 nova_compute[192716]:     </disk>
Oct 07 22:09:49 compute-0 nova_compute[192716]:     <interface type="ethernet">
Oct 07 22:09:49 compute-0 nova_compute[192716]:       <mac address="fa:16:3e:e8:d6:ff"/>
Oct 07 22:09:49 compute-0 nova_compute[192716]:       <model type="virtio"/>
Oct 07 22:09:49 compute-0 nova_compute[192716]:       <driver name="vhost" rx_queue_size="512"/>
Oct 07 22:09:49 compute-0 nova_compute[192716]:       <mtu size="1442"/>
Oct 07 22:09:49 compute-0 nova_compute[192716]:       <target dev="tap9f555d6e-b6"/>
Oct 07 22:09:49 compute-0 nova_compute[192716]:     </interface>
Oct 07 22:09:49 compute-0 nova_compute[192716]:     <serial type="pty">
Oct 07 22:09:49 compute-0 nova_compute[192716]:       <log file="/var/lib/nova/instances/325ae6e1-77ba-444e-92cf-79c32803f073/console.log" append="off"/>
Oct 07 22:09:49 compute-0 nova_compute[192716]:     </serial>
Oct 07 22:09:49 compute-0 nova_compute[192716]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 07 22:09:49 compute-0 nova_compute[192716]:     <video>
Oct 07 22:09:49 compute-0 nova_compute[192716]:       <model type="virtio"/>
Oct 07 22:09:49 compute-0 nova_compute[192716]:     </video>
Oct 07 22:09:49 compute-0 nova_compute[192716]:     <input type="tablet" bus="usb"/>
Oct 07 22:09:49 compute-0 nova_compute[192716]:     <rng model="virtio">
Oct 07 22:09:49 compute-0 nova_compute[192716]:       <backend model="random">/dev/urandom</backend>
Oct 07 22:09:49 compute-0 nova_compute[192716]:     </rng>
Oct 07 22:09:49 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root"/>
Oct 07 22:09:49 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:09:49 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:09:49 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:09:49 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:09:49 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:09:49 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:09:49 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:09:49 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:09:49 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:09:49 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:09:49 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:09:49 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:09:49 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:09:49 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:09:49 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:09:49 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:09:49 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:09:49 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:09:49 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:09:49 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:09:49 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:09:49 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:09:49 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:09:49 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:09:49 compute-0 nova_compute[192716]:     <controller type="usb" index="0"/>
Oct 07 22:09:49 compute-0 nova_compute[192716]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 07 22:09:49 compute-0 nova_compute[192716]:       <stats period="10"/>
Oct 07 22:09:49 compute-0 nova_compute[192716]:     </memballoon>
Oct 07 22:09:49 compute-0 nova_compute[192716]:   </devices>
Oct 07 22:09:49 compute-0 nova_compute[192716]: </domain>
Oct 07 22:09:49 compute-0 nova_compute[192716]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Oct 07 22:09:49 compute-0 nova_compute[192716]: 2025-10-07 22:09:49.367 2 DEBUG nova.compute.manager [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Preparing to wait for external event network-vif-plugged-9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Oct 07 22:09:49 compute-0 nova_compute[192716]: 2025-10-07 22:09:49.368 2 DEBUG oslo_concurrency.lockutils [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] Acquiring lock "325ae6e1-77ba-444e-92cf-79c32803f073-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:09:49 compute-0 nova_compute[192716]: 2025-10-07 22:09:49.368 2 DEBUG oslo_concurrency.lockutils [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] Lock "325ae6e1-77ba-444e-92cf-79c32803f073-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:09:49 compute-0 nova_compute[192716]: 2025-10-07 22:09:49.368 2 DEBUG oslo_concurrency.lockutils [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] Lock "325ae6e1-77ba-444e-92cf-79c32803f073-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:09:49 compute-0 nova_compute[192716]: 2025-10-07 22:09:49.369 2 DEBUG nova.virt.libvirt.vif [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-07T22:09:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-1186382154',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-1186382154',id=21,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='571b320e0e5e447fa64ebcac1ce7ec0d',ramdisk_id='',reservation_id='r-43pjucrb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-664850930',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-664850930-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T22:09:43Z,user_data=None,user_id='65d5e89c36a04afaa9a8bf3d1033a4f5',uuid=325ae6e1-77ba-444e-92cf-79c32803f073,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8", "address": "fa:16:3e:e8:d6:ff", "network": {"id": "ea3a75de-7deb-4587-bd4b-e492c51c608d", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1832409876-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6006393ea657476389ab742b0f55b598", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f555d6e-b6", "ovs_interfaceid": "9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 07 22:09:49 compute-0 nova_compute[192716]: 2025-10-07 22:09:49.370 2 DEBUG nova.network.os_vif_util [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] Converting VIF {"id": "9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8", "address": "fa:16:3e:e8:d6:ff", "network": {"id": "ea3a75de-7deb-4587-bd4b-e492c51c608d", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1832409876-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6006393ea657476389ab742b0f55b598", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f555d6e-b6", "ovs_interfaceid": "9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 22:09:49 compute-0 nova_compute[192716]: 2025-10-07 22:09:49.371 2 DEBUG nova.network.os_vif_util [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e8:d6:ff,bridge_name='br-int',has_traffic_filtering=True,id=9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8,network=Network(ea3a75de-7deb-4587-bd4b-e492c51c608d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9f555d6e-b6') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 22:09:49 compute-0 nova_compute[192716]: 2025-10-07 22:09:49.371 2 DEBUG os_vif [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e8:d6:ff,bridge_name='br-int',has_traffic_filtering=True,id=9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8,network=Network(ea3a75de-7deb-4587-bd4b-e492c51c608d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9f555d6e-b6') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 07 22:09:49 compute-0 nova_compute[192716]: 2025-10-07 22:09:49.372 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:09:49 compute-0 nova_compute[192716]: 2025-10-07 22:09:49.373 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:09:49 compute-0 nova_compute[192716]: 2025-10-07 22:09:49.373 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:09:49 compute-0 nova_compute[192716]: 2025-10-07 22:09:49.374 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:09:49 compute-0 nova_compute[192716]: 2025-10-07 22:09:49.375 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '5e0d3ebd-2b14-5b62-836f-f2bc63811743', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:09:49 compute-0 nova_compute[192716]: 2025-10-07 22:09:49.376 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:09:49 compute-0 nova_compute[192716]: 2025-10-07 22:09:49.378 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:09:49 compute-0 nova_compute[192716]: 2025-10-07 22:09:49.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:09:49 compute-0 nova_compute[192716]: 2025-10-07 22:09:49.381 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9f555d6e-b6, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:09:49 compute-0 nova_compute[192716]: 2025-10-07 22:09:49.382 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap9f555d6e-b6, col_values=(('qos', UUID('2f79641c-f6b8-480c-8558-b4f74f816b28')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:09:49 compute-0 nova_compute[192716]: 2025-10-07 22:09:49.383 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap9f555d6e-b6, col_values=(('external_ids', {'iface-id': '9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e8:d6:ff', 'vm-uuid': '325ae6e1-77ba-444e-92cf-79c32803f073'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:09:49 compute-0 nova_compute[192716]: 2025-10-07 22:09:49.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:09:49 compute-0 NetworkManager[51722]: <info>  [1759874989.3858] manager: (tap9f555d6e-b6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/69)
Oct 07 22:09:49 compute-0 nova_compute[192716]: 2025-10-07 22:09:49.388 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 07 22:09:49 compute-0 nova_compute[192716]: 2025-10-07 22:09:49.392 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:09:49 compute-0 nova_compute[192716]: 2025-10-07 22:09:49.393 2 INFO os_vif [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e8:d6:ff,bridge_name='br-int',has_traffic_filtering=True,id=9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8,network=Network(ea3a75de-7deb-4587-bd4b-e492c51c608d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9f555d6e-b6')
Oct 07 22:09:49 compute-0 podman[223372]: 2025-10-07 22:09:49.916315587 +0000 UTC m=+0.151535815 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251007, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct 07 22:09:50 compute-0 nova_compute[192716]: 2025-10-07 22:09:50.273 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:09:50 compute-0 nova_compute[192716]: 2025-10-07 22:09:50.970 2 DEBUG nova.virt.libvirt.driver [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 07 22:09:50 compute-0 nova_compute[192716]: 2025-10-07 22:09:50.970 2 DEBUG nova.virt.libvirt.driver [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 07 22:09:50 compute-0 nova_compute[192716]: 2025-10-07 22:09:50.971 2 DEBUG nova.virt.libvirt.driver [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] No VIF found with MAC fa:16:3e:e8:d6:ff, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Oct 07 22:09:50 compute-0 nova_compute[192716]: 2025-10-07 22:09:50.972 2 INFO nova.virt.libvirt.driver [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Using config drive
Oct 07 22:09:51 compute-0 nova_compute[192716]: 2025-10-07 22:09:51.488 2 WARNING neutronclient.v2_0.client [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:09:51 compute-0 podman[223400]: 2025-10-07 22:09:51.841129013 +0000 UTC m=+0.074992081 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct 07 22:09:52 compute-0 nova_compute[192716]: 2025-10-07 22:09:52.330 2 INFO nova.virt.libvirt.driver [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Creating config drive at /var/lib/nova/instances/325ae6e1-77ba-444e-92cf-79c32803f073/disk.config
Oct 07 22:09:52 compute-0 nova_compute[192716]: 2025-10-07 22:09:52.337 2 DEBUG oslo_concurrency.processutils [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/325ae6e1-77ba-444e-92cf-79c32803f073/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251007122402.7278e66.el10 -quiet -J -r -V config-2 /tmp/tmp548io6ef execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:09:52 compute-0 nova_compute[192716]: 2025-10-07 22:09:52.481 2 DEBUG oslo_concurrency.processutils [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/325ae6e1-77ba-444e-92cf-79c32803f073/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251007122402.7278e66.el10 -quiet -J -r -V config-2 /tmp/tmp548io6ef" returned: 0 in 0.144s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:09:52 compute-0 kernel: tap9f555d6e-b6: entered promiscuous mode
Oct 07 22:09:52 compute-0 NetworkManager[51722]: <info>  [1759874992.5589] manager: (tap9f555d6e-b6): new Tun device (/org/freedesktop/NetworkManager/Devices/70)
Oct 07 22:09:52 compute-0 ovn_controller[94904]: 2025-10-07T22:09:52Z|00191|binding|INFO|Claiming lport 9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8 for this chassis.
Oct 07 22:09:52 compute-0 ovn_controller[94904]: 2025-10-07T22:09:52Z|00192|binding|INFO|9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8: Claiming fa:16:3e:e8:d6:ff 10.100.0.3
Oct 07 22:09:52 compute-0 nova_compute[192716]: 2025-10-07 22:09:52.561 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:09:52.578 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e8:d6:ff 10.100.0.3'], port_security=['fa:16:3e:e8:d6:ff 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '325ae6e1-77ba-444e-92cf-79c32803f073', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ea3a75de-7deb-4587-bd4b-e492c51c608d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '571b320e0e5e447fa64ebcac1ce7ec0d', 'neutron:revision_number': '4', 'neutron:security_group_ids': '60ca1ecb-7714-4c2f-91a8-8375ff264bfb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cc4b6ec6-4880-401c-a20b-f966847f0277, chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], logical_port=9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:09:52.580 103791 INFO neutron.agent.ovn.metadata.agent [-] Port 9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8 in datapath ea3a75de-7deb-4587-bd4b-e492c51c608d bound to our chassis
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:09:52.582 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ea3a75de-7deb-4587-bd4b-e492c51c608d
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:09:52.601 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[4e88e6cd-a3c1-4fcc-8b45-77954b35d444]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:09:52.602 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapea3a75de-71 in ovnmeta-ea3a75de-7deb-4587-bd4b-e492c51c608d namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:09:52.606 214116 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapea3a75de-70 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:09:52.606 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[c56ea8dc-9136-4a83-952c-cdf5a1daf736]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:09:52 compute-0 systemd-udevd[223438]: Network interface NamePolicy= disabled on kernel command line.
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:09:52.608 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[d011241c-81a3-4e01-98ac-b2ea031c0a33]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:09:52 compute-0 systemd-machined[152719]: New machine qemu-16-instance-00000015.
Oct 07 22:09:52 compute-0 NetworkManager[51722]: <info>  [1759874992.6283] device (tap9f555d6e-b6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:09:52.627 103905 DEBUG oslo.privsep.daemon [-] privsep: reply[a314ac23-3d65-462d-90ba-0ff54807c66d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:09:52 compute-0 NetworkManager[51722]: <info>  [1759874992.6301] device (tap9f555d6e-b6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 07 22:09:52 compute-0 nova_compute[192716]: 2025-10-07 22:09:52.651 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:09:52.651 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[09a0c678-4597-4cb6-8e6a-f9544e2e178b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:09:52 compute-0 systemd[1]: Started Virtual Machine qemu-16-instance-00000015.
Oct 07 22:09:52 compute-0 ovn_controller[94904]: 2025-10-07T22:09:52Z|00193|binding|INFO|Setting lport 9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8 ovn-installed in OVS
Oct 07 22:09:52 compute-0 ovn_controller[94904]: 2025-10-07T22:09:52Z|00194|binding|INFO|Setting lport 9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8 up in Southbound
Oct 07 22:09:52 compute-0 nova_compute[192716]: 2025-10-07 22:09:52.658 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:09:52.686 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[52ce64bd-5fdd-4940-a3c7-add70ab6d3f3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:09:52.691 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[a42e4094-0e0b-4f22-9fb9-93b6cbc9a3d5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:09:52 compute-0 NetworkManager[51722]: <info>  [1759874992.6942] manager: (tapea3a75de-70): new Veth device (/org/freedesktop/NetworkManager/Devices/71)
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:09:52.734 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[2be98a8b-aad3-49c6-bca3-57a181794eb8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:09:52.738 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[314d3d1d-5428-4a94-a15d-819950e276bf]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:09:52 compute-0 NetworkManager[51722]: <info>  [1759874992.7756] device (tapea3a75de-70): carrier: link connected
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:09:52.785 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[b635416c-7075-4614-83f4-bdb03a811373]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:09:52.802 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[03897c0a-ca17-438f-8ad8-b3d8c274e696]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapea3a75de-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:01:b9:b4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 51], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 472635, 'reachable_time': 40962, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223471, 'error': None, 'target': 'ovnmeta-ea3a75de-7deb-4587-bd4b-e492c51c608d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:09:52.818 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[22dea82e-0e5c-43df-8d7f-05122b87491b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe01:b9b4'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 472635, 'tstamp': 472635}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223472, 'error': None, 'target': 'ovnmeta-ea3a75de-7deb-4587-bd4b-e492c51c608d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:09:52.835 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[ff8ea9ee-1ec6-43a2-9e13-039238d58539]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapea3a75de-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:01:b9:b4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 51], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 472635, 'reachable_time': 40962, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223473, 'error': None, 'target': 'ovnmeta-ea3a75de-7deb-4587-bd4b-e492c51c608d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:09:52.881 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[368c61a9-b730-4547-85a3-7f840db10702]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:09:52.965 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[a8bd6f06-6984-434d-84f6-ac9c69bb7081]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:09:52.967 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapea3a75de-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:09:52.967 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:09:52.967 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapea3a75de-70, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:09:52 compute-0 NetworkManager[51722]: <info>  [1759874992.9709] manager: (tapea3a75de-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/72)
Oct 07 22:09:52 compute-0 nova_compute[192716]: 2025-10-07 22:09:52.969 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:09:52 compute-0 kernel: tapea3a75de-70: entered promiscuous mode
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:09:52.975 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapea3a75de-70, col_values=(('external_ids', {'iface-id': '774ed54e-b9ff-4315-b52d-fe6136fddde8'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:09:52 compute-0 nova_compute[192716]: 2025-10-07 22:09:52.974 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:09:52 compute-0 ovn_controller[94904]: 2025-10-07T22:09:52Z|00195|binding|INFO|Releasing lport 774ed54e-b9ff-4315-b52d-fe6136fddde8 from this chassis (sb_readonly=0)
Oct 07 22:09:52 compute-0 nova_compute[192716]: 2025-10-07 22:09:52.976 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:09:52 compute-0 nova_compute[192716]: 2025-10-07 22:09:52.990 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:09:52 compute-0 nova_compute[192716]: 2025-10-07 22:09:52.991 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:09:52.993 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[e814dc19-e55e-49b7-aa7b-0a5ef6d7a62c]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:09:52.994 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ea3a75de-7deb-4587-bd4b-e492c51c608d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ea3a75de-7deb-4587-bd4b-e492c51c608d.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:09:52.994 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ea3a75de-7deb-4587-bd4b-e492c51c608d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ea3a75de-7deb-4587-bd4b-e492c51c608d.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:09:52.995 103791 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for ea3a75de-7deb-4587-bd4b-e492c51c608d disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:09:52.995 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ea3a75de-7deb-4587-bd4b-e492c51c608d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ea3a75de-7deb-4587-bd4b-e492c51c608d.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:09:52.996 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[0c728215-92f5-445b-8f65-4784ea37a266]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:09:52.996 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ea3a75de-7deb-4587-bd4b-e492c51c608d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ea3a75de-7deb-4587-bd4b-e492c51c608d.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:09:52.997 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[d351bf04-9720-434f-9aa4-54e519a573f2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:09:52.998 103791 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]: global
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]:     log         /dev/log local0 debug
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]:     log-tag     haproxy-metadata-proxy-ea3a75de-7deb-4587-bd4b-e492c51c608d
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]:     user        root
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]:     group       root
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]:     maxconn     1024
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]:     pidfile     /var/lib/neutron/external/pids/ea3a75de-7deb-4587-bd4b-e492c51c608d.pid.haproxy
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]:     daemon
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]: 
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]: defaults
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]:     log global
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]:     mode http
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]:     option httplog
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]:     option dontlognull
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]:     option http-server-close
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]:     option forwardfor
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]:     retries                 3
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]:     timeout http-request    30s
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]:     timeout connect         30s
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]:     timeout client          32s
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]:     timeout server          32s
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]:     timeout http-keep-alive 30s
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]: 
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]: listen listener
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]:     bind 169.254.169.254:80
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]:     
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]:     server metadata /var/lib/neutron/metadata_proxy
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]: 
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]:     http-request add-header X-OVN-Network-ID ea3a75de-7deb-4587-bd4b-e492c51c608d
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Oct 07 22:09:52 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:09:52.999 103791 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ea3a75de-7deb-4587-bd4b-e492c51c608d', 'env', 'PROCESS_TAG=haproxy-ea3a75de-7deb-4587-bd4b-e492c51c608d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ea3a75de-7deb-4587-bd4b-e492c51c608d.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Oct 07 22:09:53 compute-0 nova_compute[192716]: 2025-10-07 22:09:53.359 2 DEBUG nova.compute.manager [req-00a27bc9-1062-4ed4-b818-63f4809074ad req-5a88bbd8-5d9d-4bfb-aac6-58398a8f0a09 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Received event network-vif-plugged-9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:09:53 compute-0 nova_compute[192716]: 2025-10-07 22:09:53.360 2 DEBUG oslo_concurrency.lockutils [req-00a27bc9-1062-4ed4-b818-63f4809074ad req-5a88bbd8-5d9d-4bfb-aac6-58398a8f0a09 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "325ae6e1-77ba-444e-92cf-79c32803f073-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:09:53 compute-0 nova_compute[192716]: 2025-10-07 22:09:53.361 2 DEBUG oslo_concurrency.lockutils [req-00a27bc9-1062-4ed4-b818-63f4809074ad req-5a88bbd8-5d9d-4bfb-aac6-58398a8f0a09 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "325ae6e1-77ba-444e-92cf-79c32803f073-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:09:53 compute-0 nova_compute[192716]: 2025-10-07 22:09:53.361 2 DEBUG oslo_concurrency.lockutils [req-00a27bc9-1062-4ed4-b818-63f4809074ad req-5a88bbd8-5d9d-4bfb-aac6-58398a8f0a09 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "325ae6e1-77ba-444e-92cf-79c32803f073-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:09:53 compute-0 nova_compute[192716]: 2025-10-07 22:09:53.361 2 DEBUG nova.compute.manager [req-00a27bc9-1062-4ed4-b818-63f4809074ad req-5a88bbd8-5d9d-4bfb-aac6-58398a8f0a09 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Processing event network-vif-plugged-9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Oct 07 22:09:53 compute-0 podman[223512]: 2025-10-07 22:09:53.450055791 +0000 UTC m=+0.075738152 container create 7afbaf8205940d5d74ba2edbb492e9720602fd672a6070c4b6297e600b4f275b (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ea3a75de-7deb-4587-bd4b-e492c51c608d, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, maintainer=OpenStack Kubernetes Operator team)
Oct 07 22:09:53 compute-0 podman[223512]: 2025-10-07 22:09:53.398668419 +0000 UTC m=+0.024350820 image pull 24d4277b41bbd1d97b6f360ea068040fe96182680512bacad34d1f578f4798a9 38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Oct 07 22:09:53 compute-0 systemd[1]: Started libpod-conmon-7afbaf8205940d5d74ba2edbb492e9720602fd672a6070c4b6297e600b4f275b.scope.
Oct 07 22:09:53 compute-0 nova_compute[192716]: 2025-10-07 22:09:53.518 2 DEBUG nova.compute.manager [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Oct 07 22:09:53 compute-0 nova_compute[192716]: 2025-10-07 22:09:53.523 2 DEBUG nova.virt.libvirt.driver [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Oct 07 22:09:53 compute-0 nova_compute[192716]: 2025-10-07 22:09:53.525 2 INFO nova.virt.libvirt.driver [-] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Instance spawned successfully.
Oct 07 22:09:53 compute-0 nova_compute[192716]: 2025-10-07 22:09:53.526 2 DEBUG nova.virt.libvirt.driver [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Oct 07 22:09:53 compute-0 systemd[1]: Started libcrun container.
Oct 07 22:09:53 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0302266d0ee372b41aac1287014cb33f3ee01cd6444d7ca6e6ca91286db1f19e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 07 22:09:53 compute-0 podman[223512]: 2025-10-07 22:09:53.661261376 +0000 UTC m=+0.286943767 container init 7afbaf8205940d5d74ba2edbb492e9720602fd672a6070c4b6297e600b4f275b (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ea3a75de-7deb-4587-bd4b-e492c51c608d, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.build-date=20251007, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0)
Oct 07 22:09:53 compute-0 podman[223512]: 2025-10-07 22:09:53.667079983 +0000 UTC m=+0.292762344 container start 7afbaf8205940d5d74ba2edbb492e9720602fd672a6070c4b6297e600b4f275b (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ea3a75de-7deb-4587-bd4b-e492c51c608d, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251007, tcib_managed=true)
Oct 07 22:09:53 compute-0 neutron-haproxy-ovnmeta-ea3a75de-7deb-4587-bd4b-e492c51c608d[223527]: [NOTICE]   (223531) : New worker (223533) forked
Oct 07 22:09:53 compute-0 neutron-haproxy-ovnmeta-ea3a75de-7deb-4587-bd4b-e492c51c608d[223527]: [NOTICE]   (223531) : Loading success.
Oct 07 22:09:54 compute-0 nova_compute[192716]: 2025-10-07 22:09:54.149 2 DEBUG nova.virt.libvirt.driver [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 22:09:54 compute-0 nova_compute[192716]: 2025-10-07 22:09:54.149 2 DEBUG nova.virt.libvirt.driver [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 22:09:54 compute-0 nova_compute[192716]: 2025-10-07 22:09:54.150 2 DEBUG nova.virt.libvirt.driver [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 22:09:54 compute-0 nova_compute[192716]: 2025-10-07 22:09:54.151 2 DEBUG nova.virt.libvirt.driver [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 22:09:54 compute-0 nova_compute[192716]: 2025-10-07 22:09:54.151 2 DEBUG nova.virt.libvirt.driver [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 22:09:54 compute-0 nova_compute[192716]: 2025-10-07 22:09:54.152 2 DEBUG nova.virt.libvirt.driver [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 22:09:54 compute-0 nova_compute[192716]: 2025-10-07 22:09:54.385 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:09:54 compute-0 nova_compute[192716]: 2025-10-07 22:09:54.663 2 INFO nova.compute.manager [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Took 10.49 seconds to spawn the instance on the hypervisor.
Oct 07 22:09:54 compute-0 nova_compute[192716]: 2025-10-07 22:09:54.664 2 DEBUG nova.compute.manager [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 07 22:09:55 compute-0 nova_compute[192716]: 2025-10-07 22:09:55.193 2 INFO nova.compute.manager [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Took 15.73 seconds to build instance.
Oct 07 22:09:55 compute-0 nova_compute[192716]: 2025-10-07 22:09:55.273 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:09:55 compute-0 nova_compute[192716]: 2025-10-07 22:09:55.434 2 DEBUG nova.compute.manager [req-cb18ba62-717f-4c0c-9fab-4ea3641b74db req-c3e68c13-a492-4d96-89b0-cee88172f0f8 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Received event network-vif-plugged-9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:09:55 compute-0 nova_compute[192716]: 2025-10-07 22:09:55.435 2 DEBUG oslo_concurrency.lockutils [req-cb18ba62-717f-4c0c-9fab-4ea3641b74db req-c3e68c13-a492-4d96-89b0-cee88172f0f8 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "325ae6e1-77ba-444e-92cf-79c32803f073-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:09:55 compute-0 nova_compute[192716]: 2025-10-07 22:09:55.435 2 DEBUG oslo_concurrency.lockutils [req-cb18ba62-717f-4c0c-9fab-4ea3641b74db req-c3e68c13-a492-4d96-89b0-cee88172f0f8 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "325ae6e1-77ba-444e-92cf-79c32803f073-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:09:55 compute-0 nova_compute[192716]: 2025-10-07 22:09:55.436 2 DEBUG oslo_concurrency.lockutils [req-cb18ba62-717f-4c0c-9fab-4ea3641b74db req-c3e68c13-a492-4d96-89b0-cee88172f0f8 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "325ae6e1-77ba-444e-92cf-79c32803f073-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:09:55 compute-0 nova_compute[192716]: 2025-10-07 22:09:55.436 2 DEBUG nova.compute.manager [req-cb18ba62-717f-4c0c-9fab-4ea3641b74db req-c3e68c13-a492-4d96-89b0-cee88172f0f8 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] No waiting events found dispatching network-vif-plugged-9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 22:09:55 compute-0 nova_compute[192716]: 2025-10-07 22:09:55.437 2 WARNING nova.compute.manager [req-cb18ba62-717f-4c0c-9fab-4ea3641b74db req-c3e68c13-a492-4d96-89b0-cee88172f0f8 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Received unexpected event network-vif-plugged-9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8 for instance with vm_state active and task_state None.
Oct 07 22:09:55 compute-0 nova_compute[192716]: 2025-10-07 22:09:55.698 2 DEBUG oslo_concurrency.lockutils [None req-dad93aa7-5bb3-4b83-8bdc-52918b94364c 65d5e89c36a04afaa9a8bf3d1033a4f5 571b320e0e5e447fa64ebcac1ce7ec0d - - default default] Lock "325ae6e1-77ba-444e-92cf-79c32803f073" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.245s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:09:55 compute-0 podman[223542]: 2025-10-07 22:09:55.861571171 +0000 UTC m=+0.090796074 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, name=ubi9-minimal, maintainer=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, config_id=edpm, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9)
Oct 07 22:09:58 compute-0 unix_chkpwd[223566]: password check failed for user (root)
Oct 07 22:09:58 compute-0 sshd-session[223564]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.119  user=root
Oct 07 22:09:59 compute-0 nova_compute[192716]: 2025-10-07 22:09:59.388 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:09:59 compute-0 podman[203153]: time="2025-10-07T22:09:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:09:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:09:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20751 "" "Go-http-client/1.1"
Oct 07 22:09:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:09:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3483 "" "Go-http-client/1.1"
Oct 07 22:10:00 compute-0 nova_compute[192716]: 2025-10-07 22:10:00.275 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:10:00 compute-0 sshd-session[223564]: Failed password for root from 80.94.93.119 port 11064 ssh2
Oct 07 22:10:00 compute-0 unix_chkpwd[223567]: password check failed for user (root)
Oct 07 22:10:01 compute-0 openstack_network_exporter[205305]: ERROR   22:10:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:10:01 compute-0 openstack_network_exporter[205305]: ERROR   22:10:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:10:01 compute-0 openstack_network_exporter[205305]: ERROR   22:10:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:10:01 compute-0 openstack_network_exporter[205305]: ERROR   22:10:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:10:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:10:01 compute-0 openstack_network_exporter[205305]: ERROR   22:10:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:10:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:10:02 compute-0 sshd-session[223564]: Failed password for root from 80.94.93.119 port 11064 ssh2
Oct 07 22:10:03 compute-0 unix_chkpwd[223569]: password check failed for user (root)
Oct 07 22:10:04 compute-0 nova_compute[192716]: 2025-10-07 22:10:04.391 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:10:04 compute-0 ovn_controller[94904]: 2025-10-07T22:10:04Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e8:d6:ff 10.100.0.3
Oct 07 22:10:04 compute-0 ovn_controller[94904]: 2025-10-07T22:10:04Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e8:d6:ff 10.100.0.3
Oct 07 22:10:05 compute-0 nova_compute[192716]: 2025-10-07 22:10:05.277 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:10:05 compute-0 sshd-session[223564]: Failed password for root from 80.94.93.119 port 11064 ssh2
Oct 07 22:10:07 compute-0 sshd-session[223564]: Received disconnect from 80.94.93.119 port 11064:11:  [preauth]
Oct 07 22:10:07 compute-0 sshd-session[223564]: Disconnected from authenticating user root 80.94.93.119 port 11064 [preauth]
Oct 07 22:10:07 compute-0 sshd-session[223564]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.119  user=root
Oct 07 22:10:08 compute-0 unix_chkpwd[223586]: password check failed for user (root)
Oct 07 22:10:08 compute-0 sshd-session[223584]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.119  user=root
Oct 07 22:10:09 compute-0 nova_compute[192716]: 2025-10-07 22:10:09.396 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:10:09 compute-0 podman[223587]: 2025-10-07 22:10:09.843907404 +0000 UTC m=+0.074195478 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, container_name=iscsid, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 07 22:10:09 compute-0 podman[223588]: 2025-10-07 22:10:09.854775655 +0000 UTC m=+0.081028704 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 07 22:10:10 compute-0 nova_compute[192716]: 2025-10-07 22:10:10.281 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:10:10 compute-0 sshd-session[223584]: Failed password for root from 80.94.93.119 port 10262 ssh2
Oct 07 22:10:12 compute-0 unix_chkpwd[223626]: password check failed for user (root)
Oct 07 22:10:12 compute-0 podman[223627]: 2025-10-07 22:10:12.844815372 +0000 UTC m=+0.075738113 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 07 22:10:13 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Oct 07 22:10:14 compute-0 sshd-session[223584]: Failed password for root from 80.94.93.119 port 10262 ssh2
Oct 07 22:10:14 compute-0 nova_compute[192716]: 2025-10-07 22:10:14.399 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:10:14 compute-0 unix_chkpwd[223652]: password check failed for user (root)
Oct 07 22:10:15 compute-0 nova_compute[192716]: 2025-10-07 22:10:15.284 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:10:16 compute-0 sshd-session[223584]: Failed password for root from 80.94.93.119 port 10262 ssh2
Oct 07 22:10:18 compute-0 sshd-session[223584]: Received disconnect from 80.94.93.119 port 10262:11:  [preauth]
Oct 07 22:10:18 compute-0 sshd-session[223584]: Disconnected from authenticating user root 80.94.93.119 port 10262 [preauth]
Oct 07 22:10:18 compute-0 sshd-session[223584]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.119  user=root
Oct 07 22:10:19 compute-0 nova_compute[192716]: 2025-10-07 22:10:19.469 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:10:19 compute-0 unix_chkpwd[223655]: password check failed for user (root)
Oct 07 22:10:19 compute-0 sshd-session[223653]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.119  user=root
Oct 07 22:10:20 compute-0 nova_compute[192716]: 2025-10-07 22:10:20.286 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:10:20 compute-0 podman[223656]: 2025-10-07 22:10:20.877395971 +0000 UTC m=+0.115977517 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct 07 22:10:21 compute-0 nova_compute[192716]: 2025-10-07 22:10:21.102 2 DEBUG nova.compute.manager [None req-0df708e1-572a-464c-9bbf-294667725b30 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:635
Oct 07 22:10:21 compute-0 nova_compute[192716]: 2025-10-07 22:10:21.170 2 DEBUG nova.compute.provider_tree [None req-0df708e1-572a-464c-9bbf-294667725b30 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Updating resource provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 generation from 20 to 24 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Oct 07 22:10:21 compute-0 sshd-session[223653]: Failed password for root from 80.94.93.119 port 17464 ssh2
Oct 07 22:10:21 compute-0 unix_chkpwd[223682]: password check failed for user (root)
Oct 07 22:10:22 compute-0 podman[223683]: 2025-10-07 22:10:22.820921699 +0000 UTC m=+0.060311873 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, 
config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Oct 07 22:10:22 compute-0 ovn_controller[94904]: 2025-10-07T22:10:22Z|00196|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Oct 07 22:10:24 compute-0 sshd-session[223653]: Failed password for root from 80.94.93.119 port 17464 ssh2
Oct 07 22:10:24 compute-0 nova_compute[192716]: 2025-10-07 22:10:24.472 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:10:25 compute-0 nova_compute[192716]: 2025-10-07 22:10:25.288 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:10:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:10:25.644 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:10:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:10:25.645 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:10:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:10:25.645 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:10:25 compute-0 unix_chkpwd[223704]: password check failed for user (root)
Oct 07 22:10:26 compute-0 podman[223705]: 2025-10-07 22:10:26.82300822 +0000 UTC m=+0.065335507 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, distribution-scope=public, io.buildah.version=1.33.7, version=9.6, vcs-type=git, io.openshift.expose-services=, release=1755695350, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm)
Oct 07 22:10:28 compute-0 sshd-session[223653]: Failed password for root from 80.94.93.119 port 17464 ssh2
Oct 07 22:10:29 compute-0 nova_compute[192716]: 2025-10-07 22:10:29.475 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:10:29 compute-0 podman[203153]: time="2025-10-07T22:10:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:10:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:10:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20751 "" "Go-http-client/1.1"
Oct 07 22:10:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:10:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3489 "" "Go-http-client/1.1"
Oct 07 22:10:30 compute-0 nova_compute[192716]: 2025-10-07 22:10:30.144 2 DEBUG nova.virt.libvirt.driver [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Check if temp file /var/lib/nova/instances/tmpk15flo59 exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Oct 07 22:10:30 compute-0 sshd-session[223653]: Received disconnect from 80.94.93.119 port 17464:11:  [preauth]
Oct 07 22:10:30 compute-0 sshd-session[223653]: Disconnected from authenticating user root 80.94.93.119 port 17464 [preauth]
Oct 07 22:10:30 compute-0 sshd-session[223653]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.119  user=root
Oct 07 22:10:30 compute-0 nova_compute[192716]: 2025-10-07 22:10:30.152 2 DEBUG nova.compute.manager [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpk15flo59',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='325ae6e1-77ba-444e-92cf-79c32803f073',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9294
Oct 07 22:10:30 compute-0 nova_compute[192716]: 2025-10-07 22:10:30.289 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:10:31 compute-0 openstack_network_exporter[205305]: ERROR   22:10:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:10:31 compute-0 openstack_network_exporter[205305]: ERROR   22:10:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:10:31 compute-0 openstack_network_exporter[205305]: ERROR   22:10:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:10:31 compute-0 openstack_network_exporter[205305]: ERROR   22:10:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:10:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:10:31 compute-0 openstack_network_exporter[205305]: ERROR   22:10:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:10:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:10:31 compute-0 nova_compute[192716]: 2025-10-07 22:10:31.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:10:32 compute-0 nova_compute[192716]: 2025-10-07 22:10:32.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:10:32 compute-0 nova_compute[192716]: 2025-10-07 22:10:32.990 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 07 22:10:34 compute-0 nova_compute[192716]: 2025-10-07 22:10:34.477 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:10:34 compute-0 nova_compute[192716]: 2025-10-07 22:10:34.812 2 DEBUG oslo_concurrency.processutils [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/325ae6e1-77ba-444e-92cf-79c32803f073/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:10:34 compute-0 nova_compute[192716]: 2025-10-07 22:10:34.880 2 DEBUG oslo_concurrency.processutils [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/325ae6e1-77ba-444e-92cf-79c32803f073/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:10:34 compute-0 nova_compute[192716]: 2025-10-07 22:10:34.881 2 DEBUG oslo_concurrency.processutils [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/325ae6e1-77ba-444e-92cf-79c32803f073/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:10:34 compute-0 nova_compute[192716]: 2025-10-07 22:10:34.941 2 DEBUG oslo_concurrency.processutils [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/325ae6e1-77ba-444e-92cf-79c32803f073/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:10:34 compute-0 nova_compute[192716]: 2025-10-07 22:10:34.943 2 DEBUG nova.compute.manager [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Preparing to wait for external event network-vif-plugged-9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Oct 07 22:10:34 compute-0 nova_compute[192716]: 2025-10-07 22:10:34.944 2 DEBUG oslo_concurrency.lockutils [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "325ae6e1-77ba-444e-92cf-79c32803f073-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:10:34 compute-0 nova_compute[192716]: 2025-10-07 22:10:34.945 2 DEBUG oslo_concurrency.lockutils [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "325ae6e1-77ba-444e-92cf-79c32803f073-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:10:34 compute-0 nova_compute[192716]: 2025-10-07 22:10:34.945 2 DEBUG oslo_concurrency.lockutils [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "325ae6e1-77ba-444e-92cf-79c32803f073-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:10:35 compute-0 nova_compute[192716]: 2025-10-07 22:10:35.291 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:10:36 compute-0 nova_compute[192716]: 2025-10-07 22:10:36.986 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:10:39 compute-0 nova_compute[192716]: 2025-10-07 22:10:39.480 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:10:40 compute-0 nova_compute[192716]: 2025-10-07 22:10:40.294 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:10:40 compute-0 nova_compute[192716]: 2025-10-07 22:10:40.556 2 DEBUG nova.compute.manager [req-c3da167a-7d9b-4c8e-8257-f0d3954637ba req-9d373eb1-1fba-462b-b67d-0d90b1c841f1 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Received event network-vif-unplugged-9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:10:40 compute-0 nova_compute[192716]: 2025-10-07 22:10:40.556 2 DEBUG oslo_concurrency.lockutils [req-c3da167a-7d9b-4c8e-8257-f0d3954637ba req-9d373eb1-1fba-462b-b67d-0d90b1c841f1 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "325ae6e1-77ba-444e-92cf-79c32803f073-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:10:40 compute-0 nova_compute[192716]: 2025-10-07 22:10:40.556 2 DEBUG oslo_concurrency.lockutils [req-c3da167a-7d9b-4c8e-8257-f0d3954637ba req-9d373eb1-1fba-462b-b67d-0d90b1c841f1 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "325ae6e1-77ba-444e-92cf-79c32803f073-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:10:40 compute-0 nova_compute[192716]: 2025-10-07 22:10:40.557 2 DEBUG oslo_concurrency.lockutils [req-c3da167a-7d9b-4c8e-8257-f0d3954637ba req-9d373eb1-1fba-462b-b67d-0d90b1c841f1 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "325ae6e1-77ba-444e-92cf-79c32803f073-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:10:40 compute-0 nova_compute[192716]: 2025-10-07 22:10:40.557 2 DEBUG nova.compute.manager [req-c3da167a-7d9b-4c8e-8257-f0d3954637ba req-9d373eb1-1fba-462b-b67d-0d90b1c841f1 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] No event matching network-vif-unplugged-9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8 in dict_keys([('network-vif-plugged', '9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:349
Oct 07 22:10:40 compute-0 nova_compute[192716]: 2025-10-07 22:10:40.557 2 DEBUG nova.compute.manager [req-c3da167a-7d9b-4c8e-8257-f0d3954637ba req-9d373eb1-1fba-462b-b67d-0d90b1c841f1 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Received event network-vif-unplugged-9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 07 22:10:40 compute-0 podman[223734]: 2025-10-07 22:10:40.836016752 +0000 UTC m=+0.070951019 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct 07 22:10:40 compute-0 podman[223733]: 2025-10-07 22:10:40.876362777 +0000 UTC m=+0.105741213 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, config_id=iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=iscsid, managed_by=edpm_ansible)
Oct 07 22:10:40 compute-0 nova_compute[192716]: 2025-10-07 22:10:40.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:10:41 compute-0 nova_compute[192716]: 2025-10-07 22:10:41.473 2 INFO nova.compute.manager [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Took 6.53 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Oct 07 22:10:41 compute-0 nova_compute[192716]: 2025-10-07 22:10:41.989 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:10:42 compute-0 nova_compute[192716]: 2025-10-07 22:10:42.516 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:10:42 compute-0 nova_compute[192716]: 2025-10-07 22:10:42.517 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:10:42 compute-0 nova_compute[192716]: 2025-10-07 22:10:42.517 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:10:42 compute-0 nova_compute[192716]: 2025-10-07 22:10:42.517 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 07 22:10:42 compute-0 nova_compute[192716]: 2025-10-07 22:10:42.623 2 DEBUG nova.compute.manager [req-5c3d7fe0-29e3-4e86-9c73-4dc3dd0471b4 req-842ce3ae-d683-4567-954f-3f931c712574 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Received event network-vif-plugged-9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:10:42 compute-0 nova_compute[192716]: 2025-10-07 22:10:42.623 2 DEBUG oslo_concurrency.lockutils [req-5c3d7fe0-29e3-4e86-9c73-4dc3dd0471b4 req-842ce3ae-d683-4567-954f-3f931c712574 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "325ae6e1-77ba-444e-92cf-79c32803f073-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:10:42 compute-0 nova_compute[192716]: 2025-10-07 22:10:42.623 2 DEBUG oslo_concurrency.lockutils [req-5c3d7fe0-29e3-4e86-9c73-4dc3dd0471b4 req-842ce3ae-d683-4567-954f-3f931c712574 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "325ae6e1-77ba-444e-92cf-79c32803f073-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:10:42 compute-0 nova_compute[192716]: 2025-10-07 22:10:42.624 2 DEBUG oslo_concurrency.lockutils [req-5c3d7fe0-29e3-4e86-9c73-4dc3dd0471b4 req-842ce3ae-d683-4567-954f-3f931c712574 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "325ae6e1-77ba-444e-92cf-79c32803f073-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:10:42 compute-0 nova_compute[192716]: 2025-10-07 22:10:42.624 2 DEBUG nova.compute.manager [req-5c3d7fe0-29e3-4e86-9c73-4dc3dd0471b4 req-842ce3ae-d683-4567-954f-3f931c712574 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Processing event network-vif-plugged-9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Oct 07 22:10:42 compute-0 nova_compute[192716]: 2025-10-07 22:10:42.624 2 DEBUG nova.compute.manager [req-5c3d7fe0-29e3-4e86-9c73-4dc3dd0471b4 req-842ce3ae-d683-4567-954f-3f931c712574 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Received event network-changed-9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:10:42 compute-0 nova_compute[192716]: 2025-10-07 22:10:42.625 2 DEBUG nova.compute.manager [req-5c3d7fe0-29e3-4e86-9c73-4dc3dd0471b4 req-842ce3ae-d683-4567-954f-3f931c712574 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Refreshing instance network info cache due to event network-changed-9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Oct 07 22:10:42 compute-0 nova_compute[192716]: 2025-10-07 22:10:42.625 2 DEBUG oslo_concurrency.lockutils [req-5c3d7fe0-29e3-4e86-9c73-4dc3dd0471b4 req-842ce3ae-d683-4567-954f-3f931c712574 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "refresh_cache-325ae6e1-77ba-444e-92cf-79c32803f073" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 07 22:10:42 compute-0 nova_compute[192716]: 2025-10-07 22:10:42.625 2 DEBUG oslo_concurrency.lockutils [req-5c3d7fe0-29e3-4e86-9c73-4dc3dd0471b4 req-842ce3ae-d683-4567-954f-3f931c712574 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquired lock "refresh_cache-325ae6e1-77ba-444e-92cf-79c32803f073" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 07 22:10:42 compute-0 nova_compute[192716]: 2025-10-07 22:10:42.626 2 DEBUG nova.network.neutron [req-5c3d7fe0-29e3-4e86-9c73-4dc3dd0471b4 req-842ce3ae-d683-4567-954f-3f931c712574 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Refreshing network info cache for port 9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Oct 07 22:10:42 compute-0 nova_compute[192716]: 2025-10-07 22:10:42.627 2 DEBUG nova.compute.manager [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Oct 07 22:10:43 compute-0 nova_compute[192716]: 2025-10-07 22:10:43.133 2 WARNING neutronclient.v2_0.client [req-5c3d7fe0-29e3-4e86-9c73-4dc3dd0471b4 req-842ce3ae-d683-4567-954f-3f931c712574 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:10:43 compute-0 nova_compute[192716]: 2025-10-07 22:10:43.141 2 DEBUG nova.compute.manager [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpk15flo59',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='325ae6e1-77ba-444e-92cf-79c32803f073',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(533fc4dd-d9ee-4ddc-84c2-dc6fe98ca3dc),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9659
Oct 07 22:10:43 compute-0 nova_compute[192716]: 2025-10-07 22:10:43.568 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/325ae6e1-77ba-444e-92cf-79c32803f073/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:10:43 compute-0 nova_compute[192716]: 2025-10-07 22:10:43.660 2 WARNING neutronclient.v2_0.client [req-5c3d7fe0-29e3-4e86-9c73-4dc3dd0471b4 req-842ce3ae-d683-4567-954f-3f931c712574 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:10:43 compute-0 nova_compute[192716]: 2025-10-07 22:10:43.667 2 DEBUG nova.objects.instance [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lazy-loading 'migration_context' on Instance uuid 325ae6e1-77ba-444e-92cf-79c32803f073 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 22:10:43 compute-0 nova_compute[192716]: 2025-10-07 22:10:43.669 2 DEBUG nova.virt.libvirt.driver [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Oct 07 22:10:43 compute-0 nova_compute[192716]: 2025-10-07 22:10:43.672 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/325ae6e1-77ba-444e-92cf-79c32803f073/disk --force-share --output=json" returned: 0 in 0.104s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:10:43 compute-0 nova_compute[192716]: 2025-10-07 22:10:43.672 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/325ae6e1-77ba-444e-92cf-79c32803f073/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:10:43 compute-0 nova_compute[192716]: 2025-10-07 22:10:43.683 2 DEBUG nova.virt.libvirt.driver [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Oct 07 22:10:43 compute-0 nova_compute[192716]: 2025-10-07 22:10:43.685 2 DEBUG nova.virt.libvirt.driver [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Oct 07 22:10:43 compute-0 nova_compute[192716]: 2025-10-07 22:10:43.699 2 DEBUG nova.virt.libvirt.vif [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-10-07T22:09:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-1186382154',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-1186382154',id=21,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-07T22:09:54Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='571b320e0e5e447fa64ebcac1ce7ec0d',ramdisk_id='',reservation_id='r-43pjucrb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-664850930',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-664850930-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T22:09:54Z,user_data=None,user_id='65d5e89c36a04afaa9a8bf3d1033a4f5',uuid=325ae6e1-77ba-444e-92cf-79c32803f073,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8", "address": "fa:16:3e:e8:d6:ff", "network": {"id": "ea3a75de-7deb-4587-bd4b-e492c51c608d", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1832409876-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6006393ea657476389ab742b0f55b598", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap9f555d6e-b6", "ovs_interfaceid": "9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Oct 07 22:10:43 compute-0 nova_compute[192716]: 2025-10-07 22:10:43.700 2 DEBUG nova.network.os_vif_util [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Converting VIF {"id": "9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8", "address": "fa:16:3e:e8:d6:ff", "network": {"id": "ea3a75de-7deb-4587-bd4b-e492c51c608d", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1832409876-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6006393ea657476389ab742b0f55b598", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap9f555d6e-b6", "ovs_interfaceid": "9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 22:10:43 compute-0 nova_compute[192716]: 2025-10-07 22:10:43.701 2 DEBUG nova.network.os_vif_util [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e8:d6:ff,bridge_name='br-int',has_traffic_filtering=True,id=9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8,network=Network(ea3a75de-7deb-4587-bd4b-e492c51c608d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9f555d6e-b6') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 22:10:43 compute-0 nova_compute[192716]: 2025-10-07 22:10:43.702 2 DEBUG nova.virt.libvirt.migration [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Updating guest XML with vif config: <interface type="ethernet">
Oct 07 22:10:43 compute-0 nova_compute[192716]:   <mac address="fa:16:3e:e8:d6:ff"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   <model type="virtio"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   <driver name="vhost" rx_queue_size="512"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   <mtu size="1442"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   <target dev="tap9f555d6e-b6"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]: </interface>
Oct 07 22:10:43 compute-0 nova_compute[192716]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Oct 07 22:10:43 compute-0 nova_compute[192716]: 2025-10-07 22:10:43.703 2 DEBUG nova.virt.libvirt.migration [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Oct 07 22:10:43 compute-0 nova_compute[192716]:   <name>instance-00000015</name>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   <uuid>325ae6e1-77ba-444e-92cf-79c32803f073</uuid>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   <metadata>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <nova:package version="32.1.0-0.20251007122402.7278e66.el10"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <nova:name>tempest-TestExecuteVmWorkloadBalanceStrategy-server-1186382154</nova:name>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <nova:creationTime>2025-10-07 22:09:48</nova:creationTime>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <nova:flavor name="m1.nano" id="e6ccb6f6-2ec4-4305-9fdd-4d931e83ed21">
Oct 07 22:10:43 compute-0 nova_compute[192716]:         <nova:memory>128</nova:memory>
Oct 07 22:10:43 compute-0 nova_compute[192716]:         <nova:disk>1</nova:disk>
Oct 07 22:10:43 compute-0 nova_compute[192716]:         <nova:swap>0</nova:swap>
Oct 07 22:10:43 compute-0 nova_compute[192716]:         <nova:ephemeral>0</nova:ephemeral>
Oct 07 22:10:43 compute-0 nova_compute[192716]:         <nova:vcpus>1</nova:vcpus>
Oct 07 22:10:43 compute-0 nova_compute[192716]:         <nova:extraSpecs>
Oct 07 22:10:43 compute-0 nova_compute[192716]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 07 22:10:43 compute-0 nova_compute[192716]:         </nova:extraSpecs>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       </nova:flavor>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <nova:image uuid="c40cab67-7e52-4762-b275-de0efa24bdf4">
Oct 07 22:10:43 compute-0 nova_compute[192716]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 07 22:10:43 compute-0 nova_compute[192716]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 07 22:10:43 compute-0 nova_compute[192716]:         <nova:minDisk>1</nova:minDisk>
Oct 07 22:10:43 compute-0 nova_compute[192716]:         <nova:minRam>0</nova:minRam>
Oct 07 22:10:43 compute-0 nova_compute[192716]:         <nova:properties>
Oct 07 22:10:43 compute-0 nova_compute[192716]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 07 22:10:43 compute-0 nova_compute[192716]:         </nova:properties>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       </nova:image>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <nova:owner>
Oct 07 22:10:43 compute-0 nova_compute[192716]:         <nova:user uuid="65d5e89c36a04afaa9a8bf3d1033a4f5">tempest-TestExecuteVmWorkloadBalanceStrategy-664850930-project-admin</nova:user>
Oct 07 22:10:43 compute-0 nova_compute[192716]:         <nova:project uuid="571b320e0e5e447fa64ebcac1ce7ec0d">tempest-TestExecuteVmWorkloadBalanceStrategy-664850930</nova:project>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       </nova:owner>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <nova:root type="image" uuid="c40cab67-7e52-4762-b275-de0efa24bdf4"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <nova:ports>
Oct 07 22:10:43 compute-0 nova_compute[192716]:         <nova:port uuid="9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8">
Oct 07 22:10:43 compute-0 nova_compute[192716]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:         </nova:port>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       </nova:ports>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </nova:instance>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   </metadata>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   <memory unit="KiB">131072</memory>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   <currentMemory unit="KiB">131072</currentMemory>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   <vcpu placement="static">1</vcpu>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   <resource>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <partition>/machine</partition>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   </resource>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   <sysinfo type="smbios">
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <system>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <entry name="manufacturer">RDO</entry>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <entry name="product">OpenStack Compute</entry>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <entry name="version">32.1.0-0.20251007122402.7278e66.el10</entry>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <entry name="serial">325ae6e1-77ba-444e-92cf-79c32803f073</entry>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <entry name="uuid">325ae6e1-77ba-444e-92cf-79c32803f073</entry>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <entry name="family">Virtual Machine</entry>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </system>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   </sysinfo>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   <os>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <boot dev="hd"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <smbios mode="sysinfo"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   </os>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   <features>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <acpi/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <apic/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <vmcoreinfo state="on"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   </features>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   <cpu mode="host-model" check="partial">
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   </cpu>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   <clock offset="utc">
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <timer name="pit" tickpolicy="delay"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <timer name="hpet" present="no"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   </clock>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   <on_poweroff>destroy</on_poweroff>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   <on_reboot>restart</on_reboot>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   <on_crash>destroy</on_crash>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   <devices>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <disk type="file" device="disk">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <source file="/var/lib/nova/instances/325ae6e1-77ba-444e-92cf-79c32803f073/disk"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target dev="vda" bus="virtio"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </disk>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <disk type="file" device="cdrom">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <driver name="qemu" type="raw" cache="none"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <source file="/var/lib/nova/instances/325ae6e1-77ba-444e-92cf-79c32803f073/disk.config"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target dev="sda" bus="sata"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <readonly/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </disk>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="0" model="pcie-root"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="1" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="1" port="0x10"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="2" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="2" port="0x11"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="3" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="3" port="0x12"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="4" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="4" port="0x13"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="5" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="5" port="0x14"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="6" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="6" port="0x15"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="7" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="7" port="0x16"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="8" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="8" port="0x17"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="9" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="9" port="0x18"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="10" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="10" port="0x19"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="11" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="11" port="0x1a"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="12" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="12" port="0x1b"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="13" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="13" port="0x1c"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="14" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="14" port="0x1d"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="15" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="15" port="0x1e"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="16" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="16" port="0x1f"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="17" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="17" port="0x20"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="18" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="18" port="0x21"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="19" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="19" port="0x22"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="20" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="20" port="0x23"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="21" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="21" port="0x24"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="22" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="22" port="0x25"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="23" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="23" port="0x26"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="24" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="24" port="0x27"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="25" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="25" port="0x28"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-pci-bridge"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="usb" index="0" model="piix3-uhci">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="sata" index="0">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <interface type="ethernet"><mac address="fa:16:3e:e8:d6:ff"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap9f555d6e-b6"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </interface><serial type="pty">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <log file="/var/lib/nova/instances/325ae6e1-77ba-444e-92cf-79c32803f073/console.log" append="off"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target type="isa-serial" port="0">
Oct 07 22:10:43 compute-0 nova_compute[192716]:         <model name="isa-serial"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       </target>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </serial>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <console type="pty">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <log file="/var/lib/nova/instances/325ae6e1-77ba-444e-92cf-79c32803f073/console.log" append="off"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target type="serial" port="0"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </console>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <input type="tablet" bus="usb">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="usb" bus="0" port="1"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </input>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <input type="mouse" bus="ps2"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <listen type="address" address="::"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </graphics>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <video>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model type="virtio" heads="1" primary="yes"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </video>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <stats period="10"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </memballoon>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <rng model="virtio">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <backend model="random">/dev/urandom</backend>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </rng>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   </devices>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]: </domain>
Oct 07 22:10:43 compute-0 nova_compute[192716]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Oct 07 22:10:43 compute-0 nova_compute[192716]: 2025-10-07 22:10:43.706 2 DEBUG nova.virt.libvirt.migration [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Oct 07 22:10:43 compute-0 nova_compute[192716]:   <name>instance-00000015</name>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   <uuid>325ae6e1-77ba-444e-92cf-79c32803f073</uuid>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   <metadata>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <nova:package version="32.1.0-0.20251007122402.7278e66.el10"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <nova:name>tempest-TestExecuteVmWorkloadBalanceStrategy-server-1186382154</nova:name>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <nova:creationTime>2025-10-07 22:09:48</nova:creationTime>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <nova:flavor name="m1.nano" id="e6ccb6f6-2ec4-4305-9fdd-4d931e83ed21">
Oct 07 22:10:43 compute-0 nova_compute[192716]:         <nova:memory>128</nova:memory>
Oct 07 22:10:43 compute-0 nova_compute[192716]:         <nova:disk>1</nova:disk>
Oct 07 22:10:43 compute-0 nova_compute[192716]:         <nova:swap>0</nova:swap>
Oct 07 22:10:43 compute-0 nova_compute[192716]:         <nova:ephemeral>0</nova:ephemeral>
Oct 07 22:10:43 compute-0 nova_compute[192716]:         <nova:vcpus>1</nova:vcpus>
Oct 07 22:10:43 compute-0 nova_compute[192716]:         <nova:extraSpecs>
Oct 07 22:10:43 compute-0 nova_compute[192716]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 07 22:10:43 compute-0 nova_compute[192716]:         </nova:extraSpecs>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       </nova:flavor>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <nova:image uuid="c40cab67-7e52-4762-b275-de0efa24bdf4">
Oct 07 22:10:43 compute-0 nova_compute[192716]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 07 22:10:43 compute-0 nova_compute[192716]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 07 22:10:43 compute-0 nova_compute[192716]:         <nova:minDisk>1</nova:minDisk>
Oct 07 22:10:43 compute-0 nova_compute[192716]:         <nova:minRam>0</nova:minRam>
Oct 07 22:10:43 compute-0 nova_compute[192716]:         <nova:properties>
Oct 07 22:10:43 compute-0 nova_compute[192716]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 07 22:10:43 compute-0 nova_compute[192716]:         </nova:properties>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       </nova:image>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <nova:owner>
Oct 07 22:10:43 compute-0 nova_compute[192716]:         <nova:user uuid="65d5e89c36a04afaa9a8bf3d1033a4f5">tempest-TestExecuteVmWorkloadBalanceStrategy-664850930-project-admin</nova:user>
Oct 07 22:10:43 compute-0 nova_compute[192716]:         <nova:project uuid="571b320e0e5e447fa64ebcac1ce7ec0d">tempest-TestExecuteVmWorkloadBalanceStrategy-664850930</nova:project>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       </nova:owner>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <nova:root type="image" uuid="c40cab67-7e52-4762-b275-de0efa24bdf4"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <nova:ports>
Oct 07 22:10:43 compute-0 nova_compute[192716]:         <nova:port uuid="9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8">
Oct 07 22:10:43 compute-0 nova_compute[192716]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:         </nova:port>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       </nova:ports>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </nova:instance>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   </metadata>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   <memory unit="KiB">131072</memory>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   <currentMemory unit="KiB">131072</currentMemory>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   <vcpu placement="static">1</vcpu>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   <resource>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <partition>/machine</partition>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   </resource>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   <sysinfo type="smbios">
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <system>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <entry name="manufacturer">RDO</entry>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <entry name="product">OpenStack Compute</entry>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <entry name="version">32.1.0-0.20251007122402.7278e66.el10</entry>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <entry name="serial">325ae6e1-77ba-444e-92cf-79c32803f073</entry>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <entry name="uuid">325ae6e1-77ba-444e-92cf-79c32803f073</entry>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <entry name="family">Virtual Machine</entry>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </system>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   </sysinfo>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   <os>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <boot dev="hd"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <smbios mode="sysinfo"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   </os>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   <features>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <acpi/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <apic/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <vmcoreinfo state="on"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   </features>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   <cpu mode="host-model" check="partial">
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   </cpu>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   <clock offset="utc">
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <timer name="pit" tickpolicy="delay"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <timer name="hpet" present="no"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   </clock>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   <on_poweroff>destroy</on_poweroff>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   <on_reboot>restart</on_reboot>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   <on_crash>destroy</on_crash>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   <devices>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <disk type="file" device="disk">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <source file="/var/lib/nova/instances/325ae6e1-77ba-444e-92cf-79c32803f073/disk"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target dev="vda" bus="virtio"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </disk>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <disk type="file" device="cdrom">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <driver name="qemu" type="raw" cache="none"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <source file="/var/lib/nova/instances/325ae6e1-77ba-444e-92cf-79c32803f073/disk.config"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target dev="sda" bus="sata"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <readonly/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </disk>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="0" model="pcie-root"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="1" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="1" port="0x10"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="2" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="2" port="0x11"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="3" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="3" port="0x12"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="4" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="4" port="0x13"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="5" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="5" port="0x14"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="6" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="6" port="0x15"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="7" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="7" port="0x16"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="8" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="8" port="0x17"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="9" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="9" port="0x18"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="10" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="10" port="0x19"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="11" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="11" port="0x1a"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="12" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="12" port="0x1b"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="13" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="13" port="0x1c"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="14" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="14" port="0x1d"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="15" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="15" port="0x1e"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="16" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="16" port="0x1f"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="17" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="17" port="0x20"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="18" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="18" port="0x21"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="19" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="19" port="0x22"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="20" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="20" port="0x23"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="21" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="21" port="0x24"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="22" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="22" port="0x25"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="23" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="23" port="0x26"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="24" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="24" port="0x27"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="25" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="25" port="0x28"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-pci-bridge"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="usb" index="0" model="piix3-uhci">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="sata" index="0">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <interface type="ethernet"><mac address="fa:16:3e:e8:d6:ff"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap9f555d6e-b6"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </interface><serial type="pty">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <log file="/var/lib/nova/instances/325ae6e1-77ba-444e-92cf-79c32803f073/console.log" append="off"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target type="isa-serial" port="0">
Oct 07 22:10:43 compute-0 nova_compute[192716]:         <model name="isa-serial"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       </target>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </serial>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <console type="pty">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <log file="/var/lib/nova/instances/325ae6e1-77ba-444e-92cf-79c32803f073/console.log" append="off"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target type="serial" port="0"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </console>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <input type="tablet" bus="usb">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="usb" bus="0" port="1"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </input>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <input type="mouse" bus="ps2"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <listen type="address" address="::"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </graphics>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <video>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model type="virtio" heads="1" primary="yes"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </video>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <stats period="10"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </memballoon>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <rng model="virtio">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <backend model="random">/dev/urandom</backend>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </rng>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   </devices>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]: </domain>
Oct 07 22:10:43 compute-0 nova_compute[192716]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Oct 07 22:10:43 compute-0 nova_compute[192716]: 2025-10-07 22:10:43.707 2 DEBUG nova.virt.libvirt.migration [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] _update_pci_xml output xml=<domain type="kvm">
Oct 07 22:10:43 compute-0 nova_compute[192716]:   <name>instance-00000015</name>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   <uuid>325ae6e1-77ba-444e-92cf-79c32803f073</uuid>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   <metadata>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <nova:package version="32.1.0-0.20251007122402.7278e66.el10"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <nova:name>tempest-TestExecuteVmWorkloadBalanceStrategy-server-1186382154</nova:name>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <nova:creationTime>2025-10-07 22:09:48</nova:creationTime>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <nova:flavor name="m1.nano" id="e6ccb6f6-2ec4-4305-9fdd-4d931e83ed21">
Oct 07 22:10:43 compute-0 nova_compute[192716]:         <nova:memory>128</nova:memory>
Oct 07 22:10:43 compute-0 nova_compute[192716]:         <nova:disk>1</nova:disk>
Oct 07 22:10:43 compute-0 nova_compute[192716]:         <nova:swap>0</nova:swap>
Oct 07 22:10:43 compute-0 nova_compute[192716]:         <nova:ephemeral>0</nova:ephemeral>
Oct 07 22:10:43 compute-0 nova_compute[192716]:         <nova:vcpus>1</nova:vcpus>
Oct 07 22:10:43 compute-0 nova_compute[192716]:         <nova:extraSpecs>
Oct 07 22:10:43 compute-0 nova_compute[192716]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 07 22:10:43 compute-0 nova_compute[192716]:         </nova:extraSpecs>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       </nova:flavor>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <nova:image uuid="c40cab67-7e52-4762-b275-de0efa24bdf4">
Oct 07 22:10:43 compute-0 nova_compute[192716]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 07 22:10:43 compute-0 nova_compute[192716]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 07 22:10:43 compute-0 nova_compute[192716]:         <nova:minDisk>1</nova:minDisk>
Oct 07 22:10:43 compute-0 nova_compute[192716]:         <nova:minRam>0</nova:minRam>
Oct 07 22:10:43 compute-0 nova_compute[192716]:         <nova:properties>
Oct 07 22:10:43 compute-0 nova_compute[192716]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 07 22:10:43 compute-0 nova_compute[192716]:         </nova:properties>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       </nova:image>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <nova:owner>
Oct 07 22:10:43 compute-0 nova_compute[192716]:         <nova:user uuid="65d5e89c36a04afaa9a8bf3d1033a4f5">tempest-TestExecuteVmWorkloadBalanceStrategy-664850930-project-admin</nova:user>
Oct 07 22:10:43 compute-0 nova_compute[192716]:         <nova:project uuid="571b320e0e5e447fa64ebcac1ce7ec0d">tempest-TestExecuteVmWorkloadBalanceStrategy-664850930</nova:project>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       </nova:owner>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <nova:root type="image" uuid="c40cab67-7e52-4762-b275-de0efa24bdf4"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <nova:ports>
Oct 07 22:10:43 compute-0 nova_compute[192716]:         <nova:port uuid="9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8">
Oct 07 22:10:43 compute-0 nova_compute[192716]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:         </nova:port>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       </nova:ports>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </nova:instance>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   </metadata>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   <memory unit="KiB">131072</memory>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   <currentMemory unit="KiB">131072</currentMemory>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   <vcpu placement="static">1</vcpu>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   <resource>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <partition>/machine</partition>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   </resource>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   <sysinfo type="smbios">
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <system>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <entry name="manufacturer">RDO</entry>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <entry name="product">OpenStack Compute</entry>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <entry name="version">32.1.0-0.20251007122402.7278e66.el10</entry>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <entry name="serial">325ae6e1-77ba-444e-92cf-79c32803f073</entry>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <entry name="uuid">325ae6e1-77ba-444e-92cf-79c32803f073</entry>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <entry name="family">Virtual Machine</entry>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </system>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   </sysinfo>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   <os>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <boot dev="hd"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <smbios mode="sysinfo"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   </os>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   <features>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <acpi/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <apic/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <vmcoreinfo state="on"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   </features>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   <cpu mode="host-model" check="partial">
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   </cpu>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   <clock offset="utc">
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <timer name="pit" tickpolicy="delay"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <timer name="hpet" present="no"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   </clock>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   <on_poweroff>destroy</on_poweroff>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   <on_reboot>restart</on_reboot>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   <on_crash>destroy</on_crash>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   <devices>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <disk type="file" device="disk">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <source file="/var/lib/nova/instances/325ae6e1-77ba-444e-92cf-79c32803f073/disk"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target dev="vda" bus="virtio"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </disk>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <disk type="file" device="cdrom">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <driver name="qemu" type="raw" cache="none"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <source file="/var/lib/nova/instances/325ae6e1-77ba-444e-92cf-79c32803f073/disk.config"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target dev="sda" bus="sata"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <readonly/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </disk>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="0" model="pcie-root"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="1" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="1" port="0x10"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="2" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="2" port="0x11"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="3" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="3" port="0x12"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="4" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="4" port="0x13"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="5" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="5" port="0x14"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="6" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="6" port="0x15"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="7" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="7" port="0x16"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="8" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="8" port="0x17"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="9" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="9" port="0x18"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="10" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="10" port="0x19"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="11" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="11" port="0x1a"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="12" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="12" port="0x1b"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="13" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="13" port="0x1c"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="14" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="14" port="0x1d"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="15" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="15" port="0x1e"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="16" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="16" port="0x1f"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="17" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="17" port="0x20"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="18" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="18" port="0x21"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="19" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="19" port="0x22"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="20" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="20" port="0x23"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="21" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="21" port="0x24"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="22" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="22" port="0x25"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="23" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="23" port="0x26"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="24" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="24" port="0x27"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="25" model="pcie-root-port">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-root-port"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target chassis="25" port="0x28"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model name="pcie-pci-bridge"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="usb" index="0" model="piix3-uhci">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <controller type="sata" index="0">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </controller>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <interface type="ethernet">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <mac address="fa:16:3e:e8:d6:ff"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model type="virtio"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <driver name="vhost" rx_queue_size="512"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <mtu size="1442"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target dev="tap9f555d6e-b6"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </interface>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <serial type="pty">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <log file="/var/lib/nova/instances/325ae6e1-77ba-444e-92cf-79c32803f073/console.log" append="off"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target type="isa-serial" port="0">
Oct 07 22:10:43 compute-0 nova_compute[192716]:         <model name="isa-serial"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       </target>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </serial>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <console type="pty">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <log file="/var/lib/nova/instances/325ae6e1-77ba-444e-92cf-79c32803f073/console.log" append="off"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <target type="serial" port="0"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </console>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <input type="tablet" bus="usb">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="usb" bus="0" port="1"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </input>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <input type="mouse" bus="ps2"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <listen type="address" address="::"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </graphics>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <video>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <model type="virtio" heads="1" primary="yes"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </video>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <stats period="10"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </memballoon>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     <rng model="virtio">
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <backend model="random">/dev/urandom</backend>
Oct 07 22:10:43 compute-0 nova_compute[192716]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]:     </rng>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   </devices>
Oct 07 22:10:43 compute-0 nova_compute[192716]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Oct 07 22:10:43 compute-0 nova_compute[192716]: </domain>
Oct 07 22:10:43 compute-0 nova_compute[192716]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Oct 07 22:10:43 compute-0 nova_compute[192716]: 2025-10-07 22:10:43.707 2 DEBUG nova.virt.libvirt.driver [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Oct 07 22:10:43 compute-0 nova_compute[192716]: 2025-10-07 22:10:43.751 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/325ae6e1-77ba-444e-92cf-79c32803f073/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:10:43 compute-0 nova_compute[192716]: 2025-10-07 22:10:43.821 2 DEBUG nova.network.neutron [req-5c3d7fe0-29e3-4e86-9c73-4dc3dd0471b4 req-842ce3ae-d683-4567-954f-3f931c712574 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Updated VIF entry in instance network info cache for port 9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Oct 07 22:10:43 compute-0 nova_compute[192716]: 2025-10-07 22:10:43.822 2 DEBUG nova.network.neutron [req-5c3d7fe0-29e3-4e86-9c73-4dc3dd0471b4 req-842ce3ae-d683-4567-954f-3f931c712574 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Updating instance_info_cache with network_info: [{"id": "9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8", "address": "fa:16:3e:e8:d6:ff", "network": {"id": "ea3a75de-7deb-4587-bd4b-e492c51c608d", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1832409876-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6006393ea657476389ab742b0f55b598", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f555d6e-b6", "ovs_interfaceid": "9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:10:43 compute-0 podman[223780]: 2025-10-07 22:10:43.854753758 +0000 UTC m=+0.099696939 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 07 22:10:43 compute-0 nova_compute[192716]: 2025-10-07 22:10:43.925 2 WARNING nova.virt.libvirt.driver [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 22:10:43 compute-0 nova_compute[192716]: 2025-10-07 22:10:43.926 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:10:43 compute-0 nova_compute[192716]: 2025-10-07 22:10:43.948 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.022s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:10:43 compute-0 nova_compute[192716]: 2025-10-07 22:10:43.948 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5646MB free_disk=73.27431106567383GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 07 22:10:43 compute-0 nova_compute[192716]: 2025-10-07 22:10:43.949 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:10:43 compute-0 nova_compute[192716]: 2025-10-07 22:10:43.949 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:10:44 compute-0 nova_compute[192716]: 2025-10-07 22:10:44.188 2 DEBUG nova.virt.libvirt.migration [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Oct 07 22:10:44 compute-0 nova_compute[192716]: 2025-10-07 22:10:44.189 2 INFO nova.virt.libvirt.migration [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Increasing downtime to 50 ms after 0 sec elapsed time
Oct 07 22:10:44 compute-0 nova_compute[192716]: 2025-10-07 22:10:44.328 2 DEBUG oslo_concurrency.lockutils [req-5c3d7fe0-29e3-4e86-9c73-4dc3dd0471b4 req-842ce3ae-d683-4567-954f-3f931c712574 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Releasing lock "refresh_cache-325ae6e1-77ba-444e-92cf-79c32803f073" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 07 22:10:44 compute-0 nova_compute[192716]: 2025-10-07 22:10:44.482 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:10:44 compute-0 nova_compute[192716]: 2025-10-07 22:10:44.969 2 INFO nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Updating resource usage from migration 533fc4dd-d9ee-4ddc-84c2-dc6fe98ca3dc
Oct 07 22:10:45 compute-0 nova_compute[192716]: 2025-10-07 22:10:45.004 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Migration 533fc4dd-d9ee-4ddc-84c2-dc6fe98ca3dc is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Oct 07 22:10:45 compute-0 nova_compute[192716]: 2025-10-07 22:10:45.004 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 07 22:10:45 compute-0 nova_compute[192716]: 2025-10-07 22:10:45.005 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:10:43 up  1:19,  0 user,  load average: 0.36, 0.25, 0.27\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_migrating': '1', 'num_os_type_None': '1', 'num_proj_571b320e0e5e447fa64ebcac1ce7ec0d': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 07 22:10:45 compute-0 nova_compute[192716]: 2025-10-07 22:10:45.042 2 DEBUG nova.compute.provider_tree [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 22:10:45 compute-0 nova_compute[192716]: 2025-10-07 22:10:45.214 2 INFO nova.virt.libvirt.driver [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Oct 07 22:10:45 compute-0 nova_compute[192716]: 2025-10-07 22:10:45.296 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:10:45 compute-0 nova_compute[192716]: 2025-10-07 22:10:45.550 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 22:10:45 compute-0 nova_compute[192716]: 2025-10-07 22:10:45.718 2 DEBUG nova.virt.libvirt.migration [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Oct 07 22:10:45 compute-0 nova_compute[192716]: 2025-10-07 22:10:45.719 2 DEBUG nova.virt.libvirt.migration [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Oct 07 22:10:45 compute-0 kernel: tap9f555d6e-b6 (unregistering): left promiscuous mode
Oct 07 22:10:45 compute-0 NetworkManager[51722]: <info>  [1759875045.8209] device (tap9f555d6e-b6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 07 22:10:45 compute-0 ovn_controller[94904]: 2025-10-07T22:10:45Z|00197|binding|INFO|Releasing lport 9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8 from this chassis (sb_readonly=0)
Oct 07 22:10:45 compute-0 ovn_controller[94904]: 2025-10-07T22:10:45Z|00198|binding|INFO|Setting lport 9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8 down in Southbound
Oct 07 22:10:45 compute-0 ovn_controller[94904]: 2025-10-07T22:10:45Z|00199|binding|INFO|Removing iface tap9f555d6e-b6 ovn-installed in OVS
Oct 07 22:10:45 compute-0 nova_compute[192716]: 2025-10-07 22:10:45.832 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:10:45 compute-0 nova_compute[192716]: 2025-10-07 22:10:45.835 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:10:45 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:10:45.839 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e8:d6:ff 10.100.0.3'], port_security=['fa:16:3e:e8:d6:ff 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '89a2e214-6e2f-462a-b578-1487fac3513c'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '325ae6e1-77ba-444e-92cf-79c32803f073', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ea3a75de-7deb-4587-bd4b-e492c51c608d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '571b320e0e5e447fa64ebcac1ce7ec0d', 'neutron:revision_number': '10', 'neutron:security_group_ids': '60ca1ecb-7714-4c2f-91a8-8375ff264bfb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cc4b6ec6-4880-401c-a20b-f966847f0277, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], logical_port=9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f071072e510>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:10:45 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:10:45.842 103791 INFO neutron.agent.ovn.metadata.agent [-] Port 9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8 in datapath ea3a75de-7deb-4587-bd4b-e492c51c608d unbound from our chassis
Oct 07 22:10:45 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:10:45.844 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ea3a75de-7deb-4587-bd4b-e492c51c608d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 07 22:10:45 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:10:45.848 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[0ce60eed-f39c-4a25-ac39-ba4b3f4a8a31]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:10:45 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:10:45.849 103791 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ea3a75de-7deb-4587-bd4b-e492c51c608d namespace which is not needed anymore
Oct 07 22:10:45 compute-0 nova_compute[192716]: 2025-10-07 22:10:45.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:10:45 compute-0 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000015.scope: Deactivated successfully.
Oct 07 22:10:45 compute-0 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000015.scope: Consumed 14.039s CPU time.
Oct 07 22:10:45 compute-0 systemd-machined[152719]: Machine qemu-16-instance-00000015 terminated.
Oct 07 22:10:46 compute-0 neutron-haproxy-ovnmeta-ea3a75de-7deb-4587-bd4b-e492c51c608d[223527]: [NOTICE]   (223531) : haproxy version is 3.0.5-8e879a5
Oct 07 22:10:46 compute-0 neutron-haproxy-ovnmeta-ea3a75de-7deb-4587-bd4b-e492c51c608d[223527]: [NOTICE]   (223531) : path to executable is /usr/sbin/haproxy
Oct 07 22:10:46 compute-0 neutron-haproxy-ovnmeta-ea3a75de-7deb-4587-bd4b-e492c51c608d[223527]: [WARNING]  (223531) : Exiting Master process...
Oct 07 22:10:46 compute-0 podman[223840]: 2025-10-07 22:10:46.009074219 +0000 UTC m=+0.049681035 container kill 7afbaf8205940d5d74ba2edbb492e9720602fd672a6070c4b6297e600b4f275b (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ea3a75de-7deb-4587-bd4b-e492c51c608d, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0)
Oct 07 22:10:46 compute-0 neutron-haproxy-ovnmeta-ea3a75de-7deb-4587-bd4b-e492c51c608d[223527]: [ALERT]    (223531) : Current worker (223533) exited with code 143 (Terminated)
Oct 07 22:10:46 compute-0 neutron-haproxy-ovnmeta-ea3a75de-7deb-4587-bd4b-e492c51c608d[223527]: [WARNING]  (223531) : All workers exited. Exiting... (0)
Oct 07 22:10:46 compute-0 systemd[1]: libpod-7afbaf8205940d5d74ba2edbb492e9720602fd672a6070c4b6297e600b4f275b.scope: Deactivated successfully.
Oct 07 22:10:46 compute-0 nova_compute[192716]: 2025-10-07 22:10:46.024 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:10:46 compute-0 nova_compute[192716]: 2025-10-07 22:10:46.035 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:10:46 compute-0 nova_compute[192716]: 2025-10-07 22:10:46.068 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 07 22:10:46 compute-0 nova_compute[192716]: 2025-10-07 22:10:46.068 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.120s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:10:46 compute-0 podman[223857]: 2025-10-07 22:10:46.079697918 +0000 UTC m=+0.040210482 container died 7afbaf8205940d5d74ba2edbb492e9720602fd672a6070c4b6297e600b4f275b (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ea3a75de-7deb-4587-bd4b-e492c51c608d, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4)
Oct 07 22:10:46 compute-0 nova_compute[192716]: 2025-10-07 22:10:46.108 2 DEBUG nova.virt.libvirt.driver [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Oct 07 22:10:46 compute-0 nova_compute[192716]: 2025-10-07 22:10:46.109 2 DEBUG nova.virt.libvirt.driver [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Oct 07 22:10:46 compute-0 nova_compute[192716]: 2025-10-07 22:10:46.109 2 DEBUG nova.virt.libvirt.driver [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Oct 07 22:10:46 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7afbaf8205940d5d74ba2edbb492e9720602fd672a6070c4b6297e600b4f275b-userdata-shm.mount: Deactivated successfully.
Oct 07 22:10:46 compute-0 systemd[1]: var-lib-containers-storage-overlay-0302266d0ee372b41aac1287014cb33f3ee01cd6444d7ca6e6ca91286db1f19e-merged.mount: Deactivated successfully.
Oct 07 22:10:46 compute-0 podman[223857]: 2025-10-07 22:10:46.134549131 +0000 UTC m=+0.095061665 container cleanup 7afbaf8205940d5d74ba2edbb492e9720602fd672a6070c4b6297e600b4f275b (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ea3a75de-7deb-4587-bd4b-e492c51c608d, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, tcib_managed=true, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct 07 22:10:46 compute-0 systemd[1]: libpod-conmon-7afbaf8205940d5d74ba2edbb492e9720602fd672a6070c4b6297e600b4f275b.scope: Deactivated successfully.
Oct 07 22:10:46 compute-0 podman[223864]: 2025-10-07 22:10:46.158166043 +0000 UTC m=+0.099331348 container remove 7afbaf8205940d5d74ba2edbb492e9720602fd672a6070c4b6297e600b4f275b (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ea3a75de-7deb-4587-bd4b-e492c51c608d, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0)
Oct 07 22:10:46 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:10:46.177 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[03145198-c89f-4e6c-ad69-ccc0f1243b92]: (4, ("Tue Oct  7 10:10:45 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-ea3a75de-7deb-4587-bd4b-e492c51c608d (7afbaf8205940d5d74ba2edbb492e9720602fd672a6070c4b6297e600b4f275b)\n7afbaf8205940d5d74ba2edbb492e9720602fd672a6070c4b6297e600b4f275b\nTue Oct  7 10:10:46 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ea3a75de-7deb-4587-bd4b-e492c51c608d (7afbaf8205940d5d74ba2edbb492e9720602fd672a6070c4b6297e600b4f275b)\n7afbaf8205940d5d74ba2edbb492e9720602fd672a6070c4b6297e600b4f275b\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:10:46 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:10:46.179 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[6a401bc4-f4b3-41f4-8089-45a6d85527d9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:10:46 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:10:46.180 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ea3a75de-7deb-4587-bd4b-e492c51c608d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ea3a75de-7deb-4587-bd4b-e492c51c608d.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 22:10:46 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:10:46.180 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[d3b1eeb4-ee04-4742-ad1e-1816d2b0e854]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:10:46 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:10:46.181 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapea3a75de-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:10:46 compute-0 nova_compute[192716]: 2025-10-07 22:10:46.183 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:10:46 compute-0 kernel: tapea3a75de-70: left promiscuous mode
Oct 07 22:10:46 compute-0 nova_compute[192716]: 2025-10-07 22:10:46.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:10:46 compute-0 nova_compute[192716]: 2025-10-07 22:10:46.214 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:10:46 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:10:46.217 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[54263c34-c822-4fa8-94b3-0be2c0ed79db]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:10:46 compute-0 nova_compute[192716]: 2025-10-07 22:10:46.220 2 DEBUG nova.virt.libvirt.guest [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '325ae6e1-77ba-444e-92cf-79c32803f073' (instance-00000015) get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Oct 07 22:10:46 compute-0 nova_compute[192716]: 2025-10-07 22:10:46.221 2 INFO nova.virt.libvirt.driver [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Migration operation has completed
Oct 07 22:10:46 compute-0 nova_compute[192716]: 2025-10-07 22:10:46.222 2 INFO nova.compute.manager [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] _post_live_migration() is started..
Oct 07 22:10:46 compute-0 nova_compute[192716]: 2025-10-07 22:10:46.241 2 WARNING neutronclient.v2_0.client [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:10:46 compute-0 nova_compute[192716]: 2025-10-07 22:10:46.242 2 WARNING neutronclient.v2_0.client [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:10:46 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:10:46.247 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[0d4349f6-7caf-41b0-8c36-a549d05e7d74]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:10:46 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:10:46.248 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[6881740a-0409-4178-a2c4-851bbbba8de7]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:10:46 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:10:46.272 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[7cde0056-3641-43a1-8a41-c86f347343b6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 472626, 'reachable_time': 21562, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223901, 'error': None, 'target': 'ovnmeta-ea3a75de-7deb-4587-bd4b-e492c51c608d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:10:46 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:10:46.275 103905 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ea3a75de-7deb-4587-bd4b-e492c51c608d deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Oct 07 22:10:46 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:10:46.275 103905 DEBUG oslo.privsep.daemon [-] privsep: reply[e84aa5a1-0537-4bb4-97ba-111a6fae9f1c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:10:46 compute-0 systemd[1]: run-netns-ovnmeta\x2dea3a75de\x2d7deb\x2d4587\x2dbd4b\x2de492c51c608d.mount: Deactivated successfully.
Oct 07 22:10:46 compute-0 nova_compute[192716]: 2025-10-07 22:10:46.447 2 DEBUG nova.compute.manager [req-1a98e76a-f407-43b1-9a2d-a777adab8a65 req-83659e42-7911-4f19-8786-5e88eaeb74ce 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Received event network-vif-unplugged-9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:10:46 compute-0 nova_compute[192716]: 2025-10-07 22:10:46.448 2 DEBUG oslo_concurrency.lockutils [req-1a98e76a-f407-43b1-9a2d-a777adab8a65 req-83659e42-7911-4f19-8786-5e88eaeb74ce 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "325ae6e1-77ba-444e-92cf-79c32803f073-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:10:46 compute-0 nova_compute[192716]: 2025-10-07 22:10:46.450 2 DEBUG oslo_concurrency.lockutils [req-1a98e76a-f407-43b1-9a2d-a777adab8a65 req-83659e42-7911-4f19-8786-5e88eaeb74ce 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "325ae6e1-77ba-444e-92cf-79c32803f073-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:10:46 compute-0 nova_compute[192716]: 2025-10-07 22:10:46.451 2 DEBUG oslo_concurrency.lockutils [req-1a98e76a-f407-43b1-9a2d-a777adab8a65 req-83659e42-7911-4f19-8786-5e88eaeb74ce 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "325ae6e1-77ba-444e-92cf-79c32803f073-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:10:46 compute-0 nova_compute[192716]: 2025-10-07 22:10:46.452 2 DEBUG nova.compute.manager [req-1a98e76a-f407-43b1-9a2d-a777adab8a65 req-83659e42-7911-4f19-8786-5e88eaeb74ce 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] No waiting events found dispatching network-vif-unplugged-9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 22:10:46 compute-0 nova_compute[192716]: 2025-10-07 22:10:46.453 2 DEBUG nova.compute.manager [req-1a98e76a-f407-43b1-9a2d-a777adab8a65 req-83659e42-7911-4f19-8786-5e88eaeb74ce 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Received event network-vif-unplugged-9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 07 22:10:46 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:10:46.598 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:e1:cc', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '62:84:ba:a4:1b:31'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:10:46 compute-0 nova_compute[192716]: 2025-10-07 22:10:46.598 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:10:46 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:10:46.599 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 07 22:10:46 compute-0 nova_compute[192716]: 2025-10-07 22:10:46.717 2 DEBUG nova.compute.manager [req-ad207fbd-75da-4434-9cf4-739a6674f844 req-019458c4-cfe1-4cb3-8a35-7555c53a908a 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Received event network-vif-unplugged-9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:10:46 compute-0 nova_compute[192716]: 2025-10-07 22:10:46.717 2 DEBUG oslo_concurrency.lockutils [req-ad207fbd-75da-4434-9cf4-739a6674f844 req-019458c4-cfe1-4cb3-8a35-7555c53a908a 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "325ae6e1-77ba-444e-92cf-79c32803f073-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:10:46 compute-0 nova_compute[192716]: 2025-10-07 22:10:46.718 2 DEBUG oslo_concurrency.lockutils [req-ad207fbd-75da-4434-9cf4-739a6674f844 req-019458c4-cfe1-4cb3-8a35-7555c53a908a 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "325ae6e1-77ba-444e-92cf-79c32803f073-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:10:46 compute-0 nova_compute[192716]: 2025-10-07 22:10:46.719 2 DEBUG oslo_concurrency.lockutils [req-ad207fbd-75da-4434-9cf4-739a6674f844 req-019458c4-cfe1-4cb3-8a35-7555c53a908a 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "325ae6e1-77ba-444e-92cf-79c32803f073-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:10:46 compute-0 nova_compute[192716]: 2025-10-07 22:10:46.719 2 DEBUG nova.compute.manager [req-ad207fbd-75da-4434-9cf4-739a6674f844 req-019458c4-cfe1-4cb3-8a35-7555c53a908a 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] No waiting events found dispatching network-vif-unplugged-9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 22:10:46 compute-0 nova_compute[192716]: 2025-10-07 22:10:46.720 2 DEBUG nova.compute.manager [req-ad207fbd-75da-4434-9cf4-739a6674f844 req-019458c4-cfe1-4cb3-8a35-7555c53a908a 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Received event network-vif-unplugged-9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 07 22:10:46 compute-0 nova_compute[192716]: 2025-10-07 22:10:46.837 2 DEBUG nova.network.neutron [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Activated binding for port 9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Oct 07 22:10:46 compute-0 nova_compute[192716]: 2025-10-07 22:10:46.838 2 DEBUG nova.compute.manager [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8", "address": "fa:16:3e:e8:d6:ff", "network": {"id": "ea3a75de-7deb-4587-bd4b-e492c51c608d", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1832409876-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6006393ea657476389ab742b0f55b598", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f555d6e-b6", "ovs_interfaceid": "9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10059
Oct 07 22:10:46 compute-0 nova_compute[192716]: 2025-10-07 22:10:46.840 2 DEBUG nova.virt.libvirt.vif [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-10-07T22:09:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-1186382154',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-1186382154',id=21,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-07T22:09:54Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='571b320e0e5e447fa64ebcac1ce7ec0d',ramdisk_id='',reservation_id='r-43pjucrb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-664850930',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-664850930-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T22:10:24Z,user_data=None,user_id='65d5e89c36a04afaa9a8bf3d1033a4f5',uuid=325ae6e1-77ba-444e-92cf-79c32803f073,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8", "address": "fa:16:3e:e8:d6:ff", "network": {"id": "ea3a75de-7deb-4587-bd4b-e492c51c608d", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1832409876-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6006393ea657476389ab742b0f55b598", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f555d6e-b6", "ovs_interfaceid": "9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 07 22:10:46 compute-0 nova_compute[192716]: 2025-10-07 22:10:46.840 2 DEBUG nova.network.os_vif_util [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Converting VIF {"id": "9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8", "address": "fa:16:3e:e8:d6:ff", "network": {"id": "ea3a75de-7deb-4587-bd4b-e492c51c608d", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1832409876-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6006393ea657476389ab742b0f55b598", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f555d6e-b6", "ovs_interfaceid": "9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 22:10:46 compute-0 nova_compute[192716]: 2025-10-07 22:10:46.841 2 DEBUG nova.network.os_vif_util [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e8:d6:ff,bridge_name='br-int',has_traffic_filtering=True,id=9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8,network=Network(ea3a75de-7deb-4587-bd4b-e492c51c608d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9f555d6e-b6') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 22:10:46 compute-0 nova_compute[192716]: 2025-10-07 22:10:46.842 2 DEBUG os_vif [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e8:d6:ff,bridge_name='br-int',has_traffic_filtering=True,id=9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8,network=Network(ea3a75de-7deb-4587-bd4b-e492c51c608d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9f555d6e-b6') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 07 22:10:46 compute-0 nova_compute[192716]: 2025-10-07 22:10:46.845 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:10:46 compute-0 nova_compute[192716]: 2025-10-07 22:10:46.845 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9f555d6e-b6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:10:46 compute-0 nova_compute[192716]: 2025-10-07 22:10:46.847 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:10:46 compute-0 nova_compute[192716]: 2025-10-07 22:10:46.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:10:46 compute-0 nova_compute[192716]: 2025-10-07 22:10:46.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:10:46 compute-0 nova_compute[192716]: 2025-10-07 22:10:46.852 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=2f79641c-f6b8-480c-8558-b4f74f816b28) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:10:46 compute-0 nova_compute[192716]: 2025-10-07 22:10:46.853 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:10:46 compute-0 nova_compute[192716]: 2025-10-07 22:10:46.855 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:10:46 compute-0 nova_compute[192716]: 2025-10-07 22:10:46.857 2 INFO os_vif [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e8:d6:ff,bridge_name='br-int',has_traffic_filtering=True,id=9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8,network=Network(ea3a75de-7deb-4587-bd4b-e492c51c608d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9f555d6e-b6')
Oct 07 22:10:46 compute-0 nova_compute[192716]: 2025-10-07 22:10:46.858 2 DEBUG oslo_concurrency.lockutils [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:10:46 compute-0 nova_compute[192716]: 2025-10-07 22:10:46.858 2 DEBUG oslo_concurrency.lockutils [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:10:46 compute-0 nova_compute[192716]: 2025-10-07 22:10:46.859 2 DEBUG oslo_concurrency.lockutils [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:10:46 compute-0 nova_compute[192716]: 2025-10-07 22:10:46.859 2 DEBUG nova.compute.manager [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10082
Oct 07 22:10:46 compute-0 nova_compute[192716]: 2025-10-07 22:10:46.860 2 INFO nova.virt.libvirt.driver [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Deleting instance files /var/lib/nova/instances/325ae6e1-77ba-444e-92cf-79c32803f073_del
Oct 07 22:10:46 compute-0 nova_compute[192716]: 2025-10-07 22:10:46.862 2 INFO nova.virt.libvirt.driver [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Deletion of /var/lib/nova/instances/325ae6e1-77ba-444e-92cf-79c32803f073_del complete
Oct 07 22:10:47 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:10:47.600 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=dca786dc-b408-4181-8e47-0e14c60f13da, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:10:48 compute-0 nova_compute[192716]: 2025-10-07 22:10:48.496 2 DEBUG nova.compute.manager [req-ad6e7d74-7a2e-4fe8-bcd9-395f415304b4 req-efdbf740-e2e6-49d9-9f9f-9041881fe685 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Received event network-vif-plugged-9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:10:48 compute-0 nova_compute[192716]: 2025-10-07 22:10:48.497 2 DEBUG oslo_concurrency.lockutils [req-ad6e7d74-7a2e-4fe8-bcd9-395f415304b4 req-efdbf740-e2e6-49d9-9f9f-9041881fe685 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "325ae6e1-77ba-444e-92cf-79c32803f073-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:10:48 compute-0 nova_compute[192716]: 2025-10-07 22:10:48.497 2 DEBUG oslo_concurrency.lockutils [req-ad6e7d74-7a2e-4fe8-bcd9-395f415304b4 req-efdbf740-e2e6-49d9-9f9f-9041881fe685 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "325ae6e1-77ba-444e-92cf-79c32803f073-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:10:48 compute-0 nova_compute[192716]: 2025-10-07 22:10:48.497 2 DEBUG oslo_concurrency.lockutils [req-ad6e7d74-7a2e-4fe8-bcd9-395f415304b4 req-efdbf740-e2e6-49d9-9f9f-9041881fe685 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "325ae6e1-77ba-444e-92cf-79c32803f073-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:10:48 compute-0 nova_compute[192716]: 2025-10-07 22:10:48.498 2 DEBUG nova.compute.manager [req-ad6e7d74-7a2e-4fe8-bcd9-395f415304b4 req-efdbf740-e2e6-49d9-9f9f-9041881fe685 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] No waiting events found dispatching network-vif-plugged-9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 22:10:48 compute-0 nova_compute[192716]: 2025-10-07 22:10:48.498 2 WARNING nova.compute.manager [req-ad6e7d74-7a2e-4fe8-bcd9-395f415304b4 req-efdbf740-e2e6-49d9-9f9f-9041881fe685 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Received unexpected event network-vif-plugged-9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8 for instance with vm_state active and task_state migrating.
Oct 07 22:10:48 compute-0 nova_compute[192716]: 2025-10-07 22:10:48.498 2 DEBUG nova.compute.manager [req-ad6e7d74-7a2e-4fe8-bcd9-395f415304b4 req-efdbf740-e2e6-49d9-9f9f-9041881fe685 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Received event network-vif-unplugged-9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:10:48 compute-0 nova_compute[192716]: 2025-10-07 22:10:48.498 2 DEBUG oslo_concurrency.lockutils [req-ad6e7d74-7a2e-4fe8-bcd9-395f415304b4 req-efdbf740-e2e6-49d9-9f9f-9041881fe685 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "325ae6e1-77ba-444e-92cf-79c32803f073-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:10:48 compute-0 nova_compute[192716]: 2025-10-07 22:10:48.499 2 DEBUG oslo_concurrency.lockutils [req-ad6e7d74-7a2e-4fe8-bcd9-395f415304b4 req-efdbf740-e2e6-49d9-9f9f-9041881fe685 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "325ae6e1-77ba-444e-92cf-79c32803f073-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:10:48 compute-0 nova_compute[192716]: 2025-10-07 22:10:48.499 2 DEBUG oslo_concurrency.lockutils [req-ad6e7d74-7a2e-4fe8-bcd9-395f415304b4 req-efdbf740-e2e6-49d9-9f9f-9041881fe685 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "325ae6e1-77ba-444e-92cf-79c32803f073-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:10:48 compute-0 nova_compute[192716]: 2025-10-07 22:10:48.499 2 DEBUG nova.compute.manager [req-ad6e7d74-7a2e-4fe8-bcd9-395f415304b4 req-efdbf740-e2e6-49d9-9f9f-9041881fe685 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] No waiting events found dispatching network-vif-unplugged-9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 22:10:48 compute-0 nova_compute[192716]: 2025-10-07 22:10:48.499 2 DEBUG nova.compute.manager [req-ad6e7d74-7a2e-4fe8-bcd9-395f415304b4 req-efdbf740-e2e6-49d9-9f9f-9041881fe685 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Received event network-vif-unplugged-9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 07 22:10:48 compute-0 nova_compute[192716]: 2025-10-07 22:10:48.499 2 DEBUG nova.compute.manager [req-ad6e7d74-7a2e-4fe8-bcd9-395f415304b4 req-efdbf740-e2e6-49d9-9f9f-9041881fe685 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Received event network-vif-plugged-9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:10:48 compute-0 nova_compute[192716]: 2025-10-07 22:10:48.500 2 DEBUG oslo_concurrency.lockutils [req-ad6e7d74-7a2e-4fe8-bcd9-395f415304b4 req-efdbf740-e2e6-49d9-9f9f-9041881fe685 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "325ae6e1-77ba-444e-92cf-79c32803f073-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:10:48 compute-0 nova_compute[192716]: 2025-10-07 22:10:48.500 2 DEBUG oslo_concurrency.lockutils [req-ad6e7d74-7a2e-4fe8-bcd9-395f415304b4 req-efdbf740-e2e6-49d9-9f9f-9041881fe685 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "325ae6e1-77ba-444e-92cf-79c32803f073-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:10:48 compute-0 nova_compute[192716]: 2025-10-07 22:10:48.500 2 DEBUG oslo_concurrency.lockutils [req-ad6e7d74-7a2e-4fe8-bcd9-395f415304b4 req-efdbf740-e2e6-49d9-9f9f-9041881fe685 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "325ae6e1-77ba-444e-92cf-79c32803f073-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:10:48 compute-0 nova_compute[192716]: 2025-10-07 22:10:48.500 2 DEBUG nova.compute.manager [req-ad6e7d74-7a2e-4fe8-bcd9-395f415304b4 req-efdbf740-e2e6-49d9-9f9f-9041881fe685 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] No waiting events found dispatching network-vif-plugged-9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 22:10:48 compute-0 nova_compute[192716]: 2025-10-07 22:10:48.500 2 WARNING nova.compute.manager [req-ad6e7d74-7a2e-4fe8-bcd9-395f415304b4 req-efdbf740-e2e6-49d9-9f9f-9041881fe685 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Received unexpected event network-vif-plugged-9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8 for instance with vm_state active and task_state migrating.
Oct 07 22:10:48 compute-0 nova_compute[192716]: 2025-10-07 22:10:48.500 2 DEBUG nova.compute.manager [req-ad6e7d74-7a2e-4fe8-bcd9-395f415304b4 req-efdbf740-e2e6-49d9-9f9f-9041881fe685 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Received event network-vif-plugged-9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:10:48 compute-0 nova_compute[192716]: 2025-10-07 22:10:48.501 2 DEBUG oslo_concurrency.lockutils [req-ad6e7d74-7a2e-4fe8-bcd9-395f415304b4 req-efdbf740-e2e6-49d9-9f9f-9041881fe685 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "325ae6e1-77ba-444e-92cf-79c32803f073-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:10:48 compute-0 nova_compute[192716]: 2025-10-07 22:10:48.501 2 DEBUG oslo_concurrency.lockutils [req-ad6e7d74-7a2e-4fe8-bcd9-395f415304b4 req-efdbf740-e2e6-49d9-9f9f-9041881fe685 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "325ae6e1-77ba-444e-92cf-79c32803f073-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:10:48 compute-0 nova_compute[192716]: 2025-10-07 22:10:48.501 2 DEBUG oslo_concurrency.lockutils [req-ad6e7d74-7a2e-4fe8-bcd9-395f415304b4 req-efdbf740-e2e6-49d9-9f9f-9041881fe685 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "325ae6e1-77ba-444e-92cf-79c32803f073-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:10:48 compute-0 nova_compute[192716]: 2025-10-07 22:10:48.501 2 DEBUG nova.compute.manager [req-ad6e7d74-7a2e-4fe8-bcd9-395f415304b4 req-efdbf740-e2e6-49d9-9f9f-9041881fe685 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] No waiting events found dispatching network-vif-plugged-9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 22:10:48 compute-0 nova_compute[192716]: 2025-10-07 22:10:48.501 2 WARNING nova.compute.manager [req-ad6e7d74-7a2e-4fe8-bcd9-395f415304b4 req-efdbf740-e2e6-49d9-9f9f-9041881fe685 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Received unexpected event network-vif-plugged-9f555d6e-b6bb-4c0b-8aa7-037cc3316fc8 for instance with vm_state active and task_state migrating.
Oct 07 22:10:49 compute-0 nova_compute[192716]: 2025-10-07 22:10:49.069 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:10:49 compute-0 nova_compute[192716]: 2025-10-07 22:10:49.070 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:10:49 compute-0 nova_compute[192716]: 2025-10-07 22:10:49.070 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:10:50 compute-0 nova_compute[192716]: 2025-10-07 22:10:50.298 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:10:51 compute-0 nova_compute[192716]: 2025-10-07 22:10:51.855 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:10:51 compute-0 podman[223904]: 2025-10-07 22:10:51.889543055 +0000 UTC m=+0.116924906 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=watcher_latest, tcib_managed=true)
Oct 07 22:10:53 compute-0 podman[223931]: 2025-10-07 22:10:53.825661847 +0000 UTC m=+0.060612221 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct 07 22:10:55 compute-0 nova_compute[192716]: 2025-10-07 22:10:55.300 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:10:55 compute-0 nova_compute[192716]: 2025-10-07 22:10:55.398 2 DEBUG oslo_concurrency.lockutils [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "325ae6e1-77ba-444e-92cf-79c32803f073-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:10:55 compute-0 nova_compute[192716]: 2025-10-07 22:10:55.399 2 DEBUG oslo_concurrency.lockutils [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "325ae6e1-77ba-444e-92cf-79c32803f073-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:10:55 compute-0 nova_compute[192716]: 2025-10-07 22:10:55.399 2 DEBUG oslo_concurrency.lockutils [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "325ae6e1-77ba-444e-92cf-79c32803f073-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:10:55 compute-0 nova_compute[192716]: 2025-10-07 22:10:55.914 2 DEBUG oslo_concurrency.lockutils [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:10:55 compute-0 nova_compute[192716]: 2025-10-07 22:10:55.915 2 DEBUG oslo_concurrency.lockutils [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:10:55 compute-0 nova_compute[192716]: 2025-10-07 22:10:55.915 2 DEBUG oslo_concurrency.lockutils [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:10:55 compute-0 nova_compute[192716]: 2025-10-07 22:10:55.916 2 DEBUG nova.compute.resource_tracker [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 07 22:10:56 compute-0 nova_compute[192716]: 2025-10-07 22:10:56.080 2 WARNING nova.virt.libvirt.driver [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 22:10:56 compute-0 nova_compute[192716]: 2025-10-07 22:10:56.081 2 DEBUG oslo_concurrency.processutils [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:10:56 compute-0 nova_compute[192716]: 2025-10-07 22:10:56.099 2 DEBUG oslo_concurrency.processutils [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.018s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:10:56 compute-0 nova_compute[192716]: 2025-10-07 22:10:56.100 2 DEBUG nova.compute.resource_tracker [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5803MB free_disk=73.30352783203125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 07 22:10:56 compute-0 nova_compute[192716]: 2025-10-07 22:10:56.101 2 DEBUG oslo_concurrency.lockutils [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:10:56 compute-0 nova_compute[192716]: 2025-10-07 22:10:56.101 2 DEBUG oslo_concurrency.lockutils [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:10:56 compute-0 nova_compute[192716]: 2025-10-07 22:10:56.857 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:10:57 compute-0 nova_compute[192716]: 2025-10-07 22:10:57.122 2 DEBUG nova.compute.resource_tracker [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Migration for instance 325ae6e1-77ba-444e-92cf-79c32803f073 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Oct 07 22:10:57 compute-0 nova_compute[192716]: 2025-10-07 22:10:57.631 2 DEBUG nova.compute.resource_tracker [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Oct 07 22:10:57 compute-0 nova_compute[192716]: 2025-10-07 22:10:57.664 2 DEBUG nova.compute.resource_tracker [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Migration 533fc4dd-d9ee-4ddc-84c2-dc6fe98ca3dc is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Oct 07 22:10:57 compute-0 nova_compute[192716]: 2025-10-07 22:10:57.665 2 DEBUG nova.compute.resource_tracker [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 07 22:10:57 compute-0 nova_compute[192716]: 2025-10-07 22:10:57.665 2 DEBUG nova.compute.resource_tracker [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:10:56 up  1:19,  0 user,  load average: 0.38, 0.26, 0.27\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 07 22:10:57 compute-0 nova_compute[192716]: 2025-10-07 22:10:57.706 2 DEBUG nova.compute.provider_tree [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 22:10:57 compute-0 podman[223952]: 2025-10-07 22:10:57.865142016 +0000 UTC m=+0.100302315 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, architecture=x86_64, maintainer=Red Hat, Inc., release=1755695350, io.openshift.tags=minimal rhel9, config_id=edpm, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, name=ubi9-minimal, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct 07 22:10:58 compute-0 nova_compute[192716]: 2025-10-07 22:10:58.214 2 DEBUG nova.scheduler.client.report [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 22:10:58 compute-0 nova_compute[192716]: 2025-10-07 22:10:58.723 2 DEBUG nova.compute.resource_tracker [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 07 22:10:58 compute-0 nova_compute[192716]: 2025-10-07 22:10:58.724 2 DEBUG oslo_concurrency.lockutils [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.623s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:10:58 compute-0 nova_compute[192716]: 2025-10-07 22:10:58.736 2 INFO nova.compute.manager [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Oct 07 22:10:59 compute-0 podman[203153]: time="2025-10-07T22:10:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:10:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:10:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 22:10:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:10:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3020 "" "Go-http-client/1.1"
Oct 07 22:10:59 compute-0 nova_compute[192716]: 2025-10-07 22:10:59.843 2 INFO nova.scheduler.client.report [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Deleted allocation for migration 533fc4dd-d9ee-4ddc-84c2-dc6fe98ca3dc
Oct 07 22:10:59 compute-0 nova_compute[192716]: 2025-10-07 22:10:59.843 2 DEBUG nova.virt.libvirt.driver [None req-65c76fa7-b20b-4812-8ad2-1c19b42990b6 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 325ae6e1-77ba-444e-92cf-79c32803f073] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Oct 07 22:11:00 compute-0 nova_compute[192716]: 2025-10-07 22:11:00.303 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:11:01 compute-0 openstack_network_exporter[205305]: ERROR   22:11:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:11:01 compute-0 openstack_network_exporter[205305]: ERROR   22:11:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:11:01 compute-0 openstack_network_exporter[205305]: ERROR   22:11:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:11:01 compute-0 openstack_network_exporter[205305]: ERROR   22:11:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:11:01 compute-0 openstack_network_exporter[205305]: ERROR   22:11:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:11:01 compute-0 anacron[4762]: Job `cron.weekly' started
Oct 07 22:11:01 compute-0 anacron[4762]: Job `cron.weekly' terminated
Oct 07 22:11:01 compute-0 nova_compute[192716]: 2025-10-07 22:11:01.860 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:11:05 compute-0 nova_compute[192716]: 2025-10-07 22:11:05.334 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:11:06 compute-0 nova_compute[192716]: 2025-10-07 22:11:06.862 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:11:10 compute-0 nova_compute[192716]: 2025-10-07 22:11:10.338 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:11:11 compute-0 nova_compute[192716]: 2025-10-07 22:11:11.526 2 DEBUG nova.compute.manager [None req-83b1a531-cefd-43e3-81c0-123dae5c04aa 2c71b2f9f101437eaf6e12c33825a1df 293ff4341f3d48a4ae100bf4fc7b99bd - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:631
Oct 07 22:11:11 compute-0 nova_compute[192716]: 2025-10-07 22:11:11.599 2 DEBUG nova.compute.provider_tree [None req-83b1a531-cefd-43e3-81c0-123dae5c04aa 2c71b2f9f101437eaf6e12c33825a1df 293ff4341f3d48a4ae100bf4fc7b99bd - - default default] Updating resource provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 generation from 24 to 27 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Oct 07 22:11:11 compute-0 podman[223976]: 2025-10-07 22:11:11.817925802 +0000 UTC m=+0.061558927 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid)
Oct 07 22:11:11 compute-0 podman[223977]: 2025-10-07 22:11:11.822778753 +0000 UTC m=+0.061206268 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4)
Oct 07 22:11:11 compute-0 nova_compute[192716]: 2025-10-07 22:11:11.864 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:11:14 compute-0 nova_compute[192716]: 2025-10-07 22:11:14.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:11:14 compute-0 podman[224014]: 2025-10-07 22:11:14.855030187 +0000 UTC m=+0.084406208 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 07 22:11:15 compute-0 nova_compute[192716]: 2025-10-07 22:11:15.340 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:11:16 compute-0 nova_compute[192716]: 2025-10-07 22:11:16.866 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:11:20 compute-0 nova_compute[192716]: 2025-10-07 22:11:20.341 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:11:21 compute-0 nova_compute[192716]: 2025-10-07 22:11:21.869 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:11:22 compute-0 podman[224038]: 2025-10-07 22:11:22.908369458 +0000 UTC m=+0.146177700 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller)
Oct 07 22:11:24 compute-0 podman[224064]: 2025-10-07 22:11:24.843406109 +0000 UTC m=+0.076412277 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base 
Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Oct 07 22:11:25 compute-0 nova_compute[192716]: 2025-10-07 22:11:25.345 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:11:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:11:25.646 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:11:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:11:25.647 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:11:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:11:25.647 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:11:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:11:25.742 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:47:d8:54 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-3a911305-1c41-4e8b-b203-9200b81948ee', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3a911305-1c41-4e8b-b203-9200b81948ee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd3de874d926748bd99c4598b8d738295', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fef08549-83ea-44ba-a65f-44ca77f5b5f1, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=b4222228-ac79-43ed-9842-3bc20abbb5e0) old=Port_Binding(mac=['fa:16:3e:47:d8:54'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-3a911305-1c41-4e8b-b203-9200b81948ee', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3a911305-1c41-4e8b-b203-9200b81948ee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd3de874d926748bd99c4598b8d738295', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:11:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:11:25.743 103791 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port b4222228-ac79-43ed-9842-3bc20abbb5e0 in datapath 3a911305-1c41-4e8b-b203-9200b81948ee updated
Oct 07 22:11:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:11:25.744 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3a911305-1c41-4e8b-b203-9200b81948ee, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 07 22:11:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:11:25.746 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[8e95db3f-9daf-44a9-bd7a-b51e0668e808]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:11:26 compute-0 nova_compute[192716]: 2025-10-07 22:11:26.872 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:11:28 compute-0 podman[224084]: 2025-10-07 22:11:28.833944937 +0000 UTC m=+0.070593899 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.buildah.version=1.33.7, io.openshift.expose-services=, distribution-scope=public, config_id=edpm, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct 07 22:11:29 compute-0 podman[203153]: time="2025-10-07T22:11:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:11:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:11:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 22:11:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:11:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3018 "" "Go-http-client/1.1"
Oct 07 22:11:30 compute-0 nova_compute[192716]: 2025-10-07 22:11:30.347 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:11:31 compute-0 openstack_network_exporter[205305]: ERROR   22:11:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:11:31 compute-0 openstack_network_exporter[205305]: ERROR   22:11:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:11:31 compute-0 openstack_network_exporter[205305]: ERROR   22:11:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:11:31 compute-0 openstack_network_exporter[205305]: ERROR   22:11:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:11:31 compute-0 openstack_network_exporter[205305]: ERROR   22:11:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:11:31 compute-0 nova_compute[192716]: 2025-10-07 22:11:31.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:11:31 compute-0 nova_compute[192716]: 2025-10-07 22:11:31.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:11:33 compute-0 nova_compute[192716]: 2025-10-07 22:11:33.496 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:11:33 compute-0 nova_compute[192716]: 2025-10-07 22:11:33.496 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 07 22:11:33 compute-0 nova_compute[192716]: 2025-10-07 22:11:33.991 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:11:34 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:11:34.711 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:98:8b:57 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-beac7d2f-2b32-4622-9284-7921f91772e2', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-beac7d2f-2b32-4622-9284-7921f91772e2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd5a4169ef6e443b4a1f43aa9ac237c66', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dfccecd0-a71d-429b-a71b-96aab49fe872, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=8d2ba838-57a3-4144-862f-5cf30824905a) old=Port_Binding(mac=['fa:16:3e:98:8b:57'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-beac7d2f-2b32-4622-9284-7921f91772e2', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-beac7d2f-2b32-4622-9284-7921f91772e2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd5a4169ef6e443b4a1f43aa9ac237c66', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:11:34 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:11:34.712 103791 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 8d2ba838-57a3-4144-862f-5cf30824905a in datapath beac7d2f-2b32-4622-9284-7921f91772e2 updated
Oct 07 22:11:34 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:11:34.713 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network beac7d2f-2b32-4622-9284-7921f91772e2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 07 22:11:34 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:11:34.714 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[d98d299c-01d8-4ae4-8d1b-a950d7cafd8b]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:11:35 compute-0 nova_compute[192716]: 2025-10-07 22:11:35.350 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:11:36 compute-0 nova_compute[192716]: 2025-10-07 22:11:36.876 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:11:36 compute-0 nova_compute[192716]: 2025-10-07 22:11:36.986 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:11:40 compute-0 nova_compute[192716]: 2025-10-07 22:11:40.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:11:41 compute-0 nova_compute[192716]: 2025-10-07 22:11:41.879 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:11:41 compute-0 nova_compute[192716]: 2025-10-07 22:11:41.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:11:42 compute-0 podman[224107]: 2025-10-07 22:11:42.823950587 +0000 UTC m=+0.062758412 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 07 22:11:42 compute-0 podman[224108]: 2025-10-07 22:11:42.828831438 +0000 UTC m=+0.067724106 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251007, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Oct 07 22:11:42 compute-0 nova_compute[192716]: 2025-10-07 22:11:42.986 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:11:43 compute-0 sshd-session[224105]: Invalid user github from 103.115.24.11 port 38660
Oct 07 22:11:43 compute-0 sshd-session[224105]: pam_unix(sshd:auth): check pass; user unknown
Oct 07 22:11:43 compute-0 sshd-session[224105]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.115.24.11
Oct 07 22:11:43 compute-0 nova_compute[192716]: 2025-10-07 22:11:43.496 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:11:44 compute-0 nova_compute[192716]: 2025-10-07 22:11:44.016 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:11:44 compute-0 nova_compute[192716]: 2025-10-07 22:11:44.016 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:11:44 compute-0 nova_compute[192716]: 2025-10-07 22:11:44.016 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:11:44 compute-0 nova_compute[192716]: 2025-10-07 22:11:44.017 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 07 22:11:44 compute-0 nova_compute[192716]: 2025-10-07 22:11:44.172 2 WARNING nova.virt.libvirt.driver [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 22:11:44 compute-0 nova_compute[192716]: 2025-10-07 22:11:44.173 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:11:44 compute-0 nova_compute[192716]: 2025-10-07 22:11:44.200 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.027s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:11:44 compute-0 nova_compute[192716]: 2025-10-07 22:11:44.201 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5830MB free_disk=73.30289459228516GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 07 22:11:44 compute-0 nova_compute[192716]: 2025-10-07 22:11:44.201 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:11:44 compute-0 nova_compute[192716]: 2025-10-07 22:11:44.202 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:11:44 compute-0 sshd-session[224105]: Failed password for invalid user github from 103.115.24.11 port 38660 ssh2
Oct 07 22:11:45 compute-0 nova_compute[192716]: 2025-10-07 22:11:45.246 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 07 22:11:45 compute-0 nova_compute[192716]: 2025-10-07 22:11:45.247 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:11:44 up  1:20,  0 user,  load average: 0.16, 0.22, 0.26\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 07 22:11:45 compute-0 nova_compute[192716]: 2025-10-07 22:11:45.269 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Refreshing inventories for resource provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Oct 07 22:11:45 compute-0 nova_compute[192716]: 2025-10-07 22:11:45.289 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Updating ProviderTree inventory for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Oct 07 22:11:45 compute-0 nova_compute[192716]: 2025-10-07 22:11:45.290 2 DEBUG nova.compute.provider_tree [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Updating inventory in ProviderTree for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Oct 07 22:11:45 compute-0 nova_compute[192716]: 2025-10-07 22:11:45.303 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Refreshing aggregate associations for resource provider 19d1aa8e-e3fb-43ab-9849-122569e48a32, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Oct 07 22:11:45 compute-0 nova_compute[192716]: 2025-10-07 22:11:45.320 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Refreshing trait associations for resource provider 19d1aa8e-e3fb-43ab-9849-122569e48a32, traits: COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_RESCUE_BFV,COMPUTE_ACCELERATORS,HW_CPU_X86_CLMUL,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NODE,HW_CPU_X86_AVX2,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSSE3,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_ARCH_X86_64,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_DEVICE_TAGGING,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_EXTEND,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_AVX,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE41,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SVM,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_BMI2,COMPUTE_SOUND_MODEL_SB16,COMPUTE_SOUND_MODEL_USB,COMPUTE_SOUND_MODEL_AC97,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_AESNI,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,HW_CPU_X86_F16C,HW_CPU_X86_MMX,COMPUTE_SECURITY_TPM_TIS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_CRB,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_ABM,HW_ARCH_X86_64,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_STORAGE_BUS_SCSI _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Oct 07 22:11:45 compute-0 nova_compute[192716]: 2025-10-07 22:11:45.339 2 DEBUG nova.compute.provider_tree [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 22:11:45 compute-0 nova_compute[192716]: 2025-10-07 22:11:45.355 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:11:45 compute-0 sshd-session[224105]: Received disconnect from 103.115.24.11 port 38660:11: Bye Bye [preauth]
Oct 07 22:11:45 compute-0 sshd-session[224105]: Disconnected from invalid user github 103.115.24.11 port 38660 [preauth]
Oct 07 22:11:45 compute-0 podman[224147]: 2025-10-07 22:11:45.8121624 +0000 UTC m=+0.057642255 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 07 22:11:45 compute-0 nova_compute[192716]: 2025-10-07 22:11:45.846 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 22:11:46 compute-0 nova_compute[192716]: 2025-10-07 22:11:46.356 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 07 22:11:46 compute-0 nova_compute[192716]: 2025-10-07 22:11:46.357 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.155s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:11:46 compute-0 nova_compute[192716]: 2025-10-07 22:11:46.357 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:11:46 compute-0 nova_compute[192716]: 2025-10-07 22:11:46.357 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Oct 07 22:11:46 compute-0 nova_compute[192716]: 2025-10-07 22:11:46.866 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Oct 07 22:11:46 compute-0 nova_compute[192716]: 2025-10-07 22:11:46.881 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:11:48 compute-0 ovn_controller[94904]: 2025-10-07T22:11:48Z|00200|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Oct 07 22:11:50 compute-0 nova_compute[192716]: 2025-10-07 22:11:50.357 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:11:50 compute-0 nova_compute[192716]: 2025-10-07 22:11:50.359 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:11:50 compute-0 nova_compute[192716]: 2025-10-07 22:11:50.360 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:11:50 compute-0 nova_compute[192716]: 2025-10-07 22:11:50.360 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:11:51 compute-0 nova_compute[192716]: 2025-10-07 22:11:51.883 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:11:52 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:11:52.589 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:e1:cc', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '62:84:ba:a4:1b:31'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:11:52 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:11:52.590 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 07 22:11:52 compute-0 nova_compute[192716]: 2025-10-07 22:11:52.590 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:11:53 compute-0 podman[224172]: 2025-10-07 22:11:53.860625231 +0000 UTC m=+0.104800615 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251007, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0)
Oct 07 22:11:53 compute-0 nova_compute[192716]: 2025-10-07 22:11:53.991 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:11:53 compute-0 nova_compute[192716]: 2025-10-07 22:11:53.991 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Oct 07 22:11:55 compute-0 nova_compute[192716]: 2025-10-07 22:11:55.364 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:11:55 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:11:55.591 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=dca786dc-b408-4181-8e47-0e14c60f13da, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:11:55 compute-0 podman[224198]: 2025-10-07 22:11:55.8455016 +0000 UTC m=+0.074801360 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Oct 07 22:11:56 compute-0 nova_compute[192716]: 2025-10-07 22:11:56.889 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:11:59 compute-0 podman[203153]: time="2025-10-07T22:11:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:11:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:11:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 22:11:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:11:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3018 "" "Go-http-client/1.1"
Oct 07 22:11:59 compute-0 podman[224217]: 2025-10-07 22:11:59.840069734 +0000 UTC m=+0.076149129 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, name=ubi9-minimal, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, version=9.6, architecture=x86_64, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Oct 07 22:12:00 compute-0 nova_compute[192716]: 2025-10-07 22:12:00.367 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:12:01 compute-0 openstack_network_exporter[205305]: ERROR   22:12:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:12:01 compute-0 openstack_network_exporter[205305]: ERROR   22:12:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:12:01 compute-0 openstack_network_exporter[205305]: ERROR   22:12:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:12:01 compute-0 openstack_network_exporter[205305]: ERROR   22:12:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:12:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:12:01 compute-0 openstack_network_exporter[205305]: ERROR   22:12:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:12:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:12:01 compute-0 nova_compute[192716]: 2025-10-07 22:12:01.892 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:12:05 compute-0 nova_compute[192716]: 2025-10-07 22:12:05.371 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:12:06 compute-0 nova_compute[192716]: 2025-10-07 22:12:06.780 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:12:06 compute-0 nova_compute[192716]: 2025-10-07 22:12:06.895 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:12:10 compute-0 nova_compute[192716]: 2025-10-07 22:12:10.373 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:12:11 compute-0 nova_compute[192716]: 2025-10-07 22:12:11.898 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:12:13 compute-0 podman[224239]: 2025-10-07 22:12:13.818136638 +0000 UTC m=+0.059161859 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251007, tcib_managed=true, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, container_name=iscsid)
Oct 07 22:12:13 compute-0 podman[224240]: 2025-10-07 22:12:13.850817232 +0000 UTC m=+0.079699562 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Oct 07 22:12:15 compute-0 nova_compute[192716]: 2025-10-07 22:12:15.374 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:12:16 compute-0 podman[224278]: 2025-10-07 22:12:16.815638539 +0000 UTC m=+0.053521976 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 07 22:12:16 compute-0 nova_compute[192716]: 2025-10-07 22:12:16.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:12:20 compute-0 nova_compute[192716]: 2025-10-07 22:12:20.376 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:12:21 compute-0 nova_compute[192716]: 2025-10-07 22:12:21.903 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:12:24 compute-0 podman[224302]: 2025-10-07 22:12:24.859001662 +0000 UTC m=+0.101656715 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS)
Oct 07 22:12:25 compute-0 nova_compute[192716]: 2025-10-07 22:12:25.379 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:12:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:12:25.648 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:12:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:12:25.648 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:12:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:12:25.649 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:12:26 compute-0 podman[224329]: 2025-10-07 22:12:26.851230404 +0000 UTC m=+0.093358646 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent)
Oct 07 22:12:26 compute-0 nova_compute[192716]: 2025-10-07 22:12:26.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:12:29 compute-0 podman[203153]: time="2025-10-07T22:12:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:12:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:12:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 22:12:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:12:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3025 "" "Go-http-client/1.1"
Oct 07 22:12:30 compute-0 nova_compute[192716]: 2025-10-07 22:12:30.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:12:30 compute-0 podman[224348]: 2025-10-07 22:12:30.824217526 +0000 UTC m=+0.063417212 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, release=1755695350, version=9.6, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct 07 22:12:31 compute-0 openstack_network_exporter[205305]: ERROR   22:12:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:12:31 compute-0 openstack_network_exporter[205305]: ERROR   22:12:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:12:31 compute-0 openstack_network_exporter[205305]: ERROR   22:12:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:12:31 compute-0 openstack_network_exporter[205305]: ERROR   22:12:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:12:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:12:31 compute-0 openstack_network_exporter[205305]: ERROR   22:12:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:12:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:12:31 compute-0 nova_compute[192716]: 2025-10-07 22:12:31.907 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:12:34 compute-0 nova_compute[192716]: 2025-10-07 22:12:34.993 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:12:34 compute-0 nova_compute[192716]: 2025-10-07 22:12:34.993 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:12:34 compute-0 nova_compute[192716]: 2025-10-07 22:12:34.993 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 07 22:12:35 compute-0 nova_compute[192716]: 2025-10-07 22:12:35.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:12:36 compute-0 nova_compute[192716]: 2025-10-07 22:12:36.909 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:12:36 compute-0 nova_compute[192716]: 2025-10-07 22:12:36.986 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:12:40 compute-0 nova_compute[192716]: 2025-10-07 22:12:40.386 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:12:41 compute-0 nova_compute[192716]: 2025-10-07 22:12:41.335 2 DEBUG nova.virt.libvirt.driver [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: cc1ed00e-556f-40b9-8d60-91367908a213] Creating tmpfile /var/lib/nova/instances/tmp4_bl0ime to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Oct 07 22:12:41 compute-0 nova_compute[192716]: 2025-10-07 22:12:41.336 2 WARNING neutronclient.v2_0.client [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:12:41 compute-0 nova_compute[192716]: 2025-10-07 22:12:41.340 2 DEBUG nova.compute.manager [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4_bl0ime',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Oct 07 22:12:41 compute-0 nova_compute[192716]: 2025-10-07 22:12:41.911 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:12:42 compute-0 nova_compute[192716]: 2025-10-07 22:12:42.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:12:43 compute-0 nova_compute[192716]: 2025-10-07 22:12:43.394 2 WARNING neutronclient.v2_0.client [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:12:43 compute-0 nova_compute[192716]: 2025-10-07 22:12:43.502 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:12:43 compute-0 nova_compute[192716]: 2025-10-07 22:12:43.503 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:12:43 compute-0 nova_compute[192716]: 2025-10-07 22:12:43.503 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:12:43 compute-0 nova_compute[192716]: 2025-10-07 22:12:43.503 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 07 22:12:43 compute-0 nova_compute[192716]: 2025-10-07 22:12:43.664 2 WARNING nova.virt.libvirt.driver [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 22:12:43 compute-0 nova_compute[192716]: 2025-10-07 22:12:43.665 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:12:43 compute-0 nova_compute[192716]: 2025-10-07 22:12:43.687 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.022s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:12:43 compute-0 nova_compute[192716]: 2025-10-07 22:12:43.688 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5866MB free_disk=73.30300903320312GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 07 22:12:43 compute-0 nova_compute[192716]: 2025-10-07 22:12:43.688 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:12:43 compute-0 nova_compute[192716]: 2025-10-07 22:12:43.689 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:12:44 compute-0 podman[224373]: 2025-10-07 22:12:44.855408285 +0000 UTC m=+0.086566490 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_build_tag=watcher_latest, container_name=multipathd, io.buildah.version=1.41.4, tcib_managed=true)
Oct 07 22:12:44 compute-0 podman[224372]: 2025-10-07 22:12:44.858184885 +0000 UTC m=+0.090109392 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=iscsid, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251007)
Oct 07 22:12:45 compute-0 nova_compute[192716]: 2025-10-07 22:12:45.246 2 WARNING nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Instance cc1ed00e-556f-40b9-8d60-91367908a213 has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Oct 07 22:12:45 compute-0 nova_compute[192716]: 2025-10-07 22:12:45.246 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 07 22:12:45 compute-0 nova_compute[192716]: 2025-10-07 22:12:45.246 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:12:43 up  1:21,  0 user,  load average: 0.06, 0.17, 0.24\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 07 22:12:45 compute-0 nova_compute[192716]: 2025-10-07 22:12:45.388 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:12:45 compute-0 nova_compute[192716]: 2025-10-07 22:12:45.425 2 DEBUG nova.compute.provider_tree [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 22:12:45 compute-0 nova_compute[192716]: 2025-10-07 22:12:45.934 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 22:12:46 compute-0 nova_compute[192716]: 2025-10-07 22:12:46.445 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 07 22:12:46 compute-0 nova_compute[192716]: 2025-10-07 22:12:46.446 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.757s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:12:46 compute-0 nova_compute[192716]: 2025-10-07 22:12:46.913 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:12:47 compute-0 nova_compute[192716]: 2025-10-07 22:12:47.447 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:12:47 compute-0 nova_compute[192716]: 2025-10-07 22:12:47.448 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:12:47 compute-0 nova_compute[192716]: 2025-10-07 22:12:47.448 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:12:47 compute-0 nova_compute[192716]: 2025-10-07 22:12:47.477 2 DEBUG nova.compute.manager [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4_bl0ime',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='cc1ed00e-556f-40b9-8d60-91367908a213',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Oct 07 22:12:47 compute-0 podman[224412]: 2025-10-07 22:12:47.852905345 +0000 UTC m=+0.090707149 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 07 22:12:48 compute-0 nova_compute[192716]: 2025-10-07 22:12:48.494 2 DEBUG oslo_concurrency.lockutils [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "refresh_cache-cc1ed00e-556f-40b9-8d60-91367908a213" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 07 22:12:48 compute-0 nova_compute[192716]: 2025-10-07 22:12:48.495 2 DEBUG oslo_concurrency.lockutils [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquired lock "refresh_cache-cc1ed00e-556f-40b9-8d60-91367908a213" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 07 22:12:48 compute-0 nova_compute[192716]: 2025-10-07 22:12:48.495 2 DEBUG nova.network.neutron [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: cc1ed00e-556f-40b9-8d60-91367908a213] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 07 22:12:48 compute-0 nova_compute[192716]: 2025-10-07 22:12:48.499 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:12:49 compute-0 nova_compute[192716]: 2025-10-07 22:12:49.002 2 WARNING neutronclient.v2_0.client [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:12:49 compute-0 nova_compute[192716]: 2025-10-07 22:12:49.782 2 WARNING neutronclient.v2_0.client [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:12:49 compute-0 nova_compute[192716]: 2025-10-07 22:12:49.936 2 DEBUG nova.network.neutron [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: cc1ed00e-556f-40b9-8d60-91367908a213] Updating instance_info_cache with network_info: [{"id": "ce7c20f0-f260-4b47-9e8d-4fd23be68ce8", "address": "fa:16:3e:81:bc:ba", "network": {"id": "3a911305-1c41-4e8b-b203-9200b81948ee", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-245730221-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3de874d926748bd99c4598b8d738295", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce7c20f0-f2", "ovs_interfaceid": "ce7c20f0-f260-4b47-9e8d-4fd23be68ce8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:12:50 compute-0 nova_compute[192716]: 2025-10-07 22:12:50.390 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:12:50 compute-0 nova_compute[192716]: 2025-10-07 22:12:50.444 2 DEBUG oslo_concurrency.lockutils [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Releasing lock "refresh_cache-cc1ed00e-556f-40b9-8d60-91367908a213" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 07 22:12:50 compute-0 nova_compute[192716]: 2025-10-07 22:12:50.467 2 DEBUG nova.virt.libvirt.driver [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: cc1ed00e-556f-40b9-8d60-91367908a213] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4_bl0ime',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='cc1ed00e-556f-40b9-8d60-91367908a213',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Oct 07 22:12:50 compute-0 nova_compute[192716]: 2025-10-07 22:12:50.469 2 DEBUG nova.virt.libvirt.driver [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: cc1ed00e-556f-40b9-8d60-91367908a213] Creating instance directory: /var/lib/nova/instances/cc1ed00e-556f-40b9-8d60-91367908a213 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Oct 07 22:12:50 compute-0 nova_compute[192716]: 2025-10-07 22:12:50.469 2 DEBUG nova.virt.libvirt.driver [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: cc1ed00e-556f-40b9-8d60-91367908a213] Creating disk.info with the contents: {'/var/lib/nova/instances/cc1ed00e-556f-40b9-8d60-91367908a213/disk': 'qcow2', '/var/lib/nova/instances/cc1ed00e-556f-40b9-8d60-91367908a213/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Oct 07 22:12:50 compute-0 nova_compute[192716]: 2025-10-07 22:12:50.469 2 DEBUG nova.virt.libvirt.driver [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: cc1ed00e-556f-40b9-8d60-91367908a213] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Oct 07 22:12:50 compute-0 nova_compute[192716]: 2025-10-07 22:12:50.470 2 DEBUG nova.objects.instance [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lazy-loading 'trusted_certs' on Instance uuid cc1ed00e-556f-40b9-8d60-91367908a213 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 22:12:50 compute-0 nova_compute[192716]: 2025-10-07 22:12:50.977 2 DEBUG oslo_utils.imageutils.format_inspector [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:12:50 compute-0 nova_compute[192716]: 2025-10-07 22:12:50.983 2 DEBUG oslo_utils.imageutils.format_inspector [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:12:50 compute-0 nova_compute[192716]: 2025-10-07 22:12:50.988 2 DEBUG oslo_concurrency.processutils [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:12:50 compute-0 nova_compute[192716]: 2025-10-07 22:12:50.997 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:12:51 compute-0 nova_compute[192716]: 2025-10-07 22:12:51.055 2 DEBUG oslo_concurrency.processutils [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:12:51 compute-0 nova_compute[192716]: 2025-10-07 22:12:51.056 2 DEBUG oslo_concurrency.lockutils [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:12:51 compute-0 nova_compute[192716]: 2025-10-07 22:12:51.057 2 DEBUG oslo_concurrency.lockutils [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:12:51 compute-0 nova_compute[192716]: 2025-10-07 22:12:51.058 2 DEBUG oslo_utils.imageutils.format_inspector [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:12:51 compute-0 nova_compute[192716]: 2025-10-07 22:12:51.060 2 DEBUG oslo_utils.imageutils.format_inspector [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:12:51 compute-0 nova_compute[192716]: 2025-10-07 22:12:51.061 2 DEBUG oslo_concurrency.processutils [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:12:51 compute-0 nova_compute[192716]: 2025-10-07 22:12:51.118 2 DEBUG oslo_concurrency.processutils [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:12:51 compute-0 nova_compute[192716]: 2025-10-07 22:12:51.119 2 DEBUG oslo_concurrency.processutils [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71,backing_fmt=raw /var/lib/nova/instances/cc1ed00e-556f-40b9-8d60-91367908a213/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:12:51 compute-0 nova_compute[192716]: 2025-10-07 22:12:51.154 2 DEBUG oslo_concurrency.processutils [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71,backing_fmt=raw /var/lib/nova/instances/cc1ed00e-556f-40b9-8d60-91367908a213/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:12:51 compute-0 nova_compute[192716]: 2025-10-07 22:12:51.155 2 DEBUG oslo_concurrency.lockutils [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.098s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:12:51 compute-0 nova_compute[192716]: 2025-10-07 22:12:51.156 2 DEBUG oslo_concurrency.processutils [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:12:51 compute-0 nova_compute[192716]: 2025-10-07 22:12:51.208 2 DEBUG oslo_concurrency.processutils [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:12:51 compute-0 nova_compute[192716]: 2025-10-07 22:12:51.209 2 DEBUG nova.virt.disk.api [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Checking if we can resize image /var/lib/nova/instances/cc1ed00e-556f-40b9-8d60-91367908a213/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 07 22:12:51 compute-0 nova_compute[192716]: 2025-10-07 22:12:51.209 2 DEBUG oslo_concurrency.processutils [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cc1ed00e-556f-40b9-8d60-91367908a213/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:12:51 compute-0 nova_compute[192716]: 2025-10-07 22:12:51.270 2 DEBUG oslo_concurrency.processutils [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cc1ed00e-556f-40b9-8d60-91367908a213/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:12:51 compute-0 nova_compute[192716]: 2025-10-07 22:12:51.272 2 DEBUG nova.virt.disk.api [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Cannot resize image /var/lib/nova/instances/cc1ed00e-556f-40b9-8d60-91367908a213/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 07 22:12:51 compute-0 nova_compute[192716]: 2025-10-07 22:12:51.272 2 DEBUG nova.objects.instance [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lazy-loading 'migration_context' on Instance uuid cc1ed00e-556f-40b9-8d60-91367908a213 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 22:12:51 compute-0 nova_compute[192716]: 2025-10-07 22:12:51.779 2 DEBUG nova.objects.base [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Object Instance<cc1ed00e-556f-40b9-8d60-91367908a213> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Oct 07 22:12:51 compute-0 nova_compute[192716]: 2025-10-07 22:12:51.780 2 DEBUG oslo_concurrency.processutils [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/cc1ed00e-556f-40b9-8d60-91367908a213/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:12:51 compute-0 nova_compute[192716]: 2025-10-07 22:12:51.823 2 DEBUG oslo_concurrency.processutils [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/cc1ed00e-556f-40b9-8d60-91367908a213/disk.config 497664" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:12:51 compute-0 nova_compute[192716]: 2025-10-07 22:12:51.823 2 DEBUG nova.virt.libvirt.driver [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: cc1ed00e-556f-40b9-8d60-91367908a213] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Oct 07 22:12:51 compute-0 nova_compute[192716]: 2025-10-07 22:12:51.825 2 DEBUG nova.virt.libvirt.vif [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-10-07T22:11:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-372835884',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-372835884',id=22,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-07T22:12:04Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d5a4169ef6e443b4a1f43aa9ac237c66',ramdisk_id='',reservation_id='r-whnkbozb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-898676780',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-898676780-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T22:12:04Z,user_data=None,user_id='ce5b7880b5ed459d9196d63a71180641',uuid=cc1ed00e-556f-40b9-8d60-91367908a213,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ce7c20f0-f260-4b47-9e8d-4fd23be68ce8", "address": "fa:16:3e:81:bc:ba", "network": {"id": "3a911305-1c41-4e8b-b203-9200b81948ee", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-245730221-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3de874d926748bd99c4598b8d738295", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapce7c20f0-f2", "ovs_interfaceid": "ce7c20f0-f260-4b47-9e8d-4fd23be68ce8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 07 22:12:51 compute-0 nova_compute[192716]: 2025-10-07 22:12:51.825 2 DEBUG nova.network.os_vif_util [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Converting VIF {"id": "ce7c20f0-f260-4b47-9e8d-4fd23be68ce8", "address": "fa:16:3e:81:bc:ba", "network": {"id": "3a911305-1c41-4e8b-b203-9200b81948ee", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-245730221-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3de874d926748bd99c4598b8d738295", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapce7c20f0-f2", "ovs_interfaceid": "ce7c20f0-f260-4b47-9e8d-4fd23be68ce8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 22:12:51 compute-0 nova_compute[192716]: 2025-10-07 22:12:51.826 2 DEBUG nova.network.os_vif_util [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:81:bc:ba,bridge_name='br-int',has_traffic_filtering=True,id=ce7c20f0-f260-4b47-9e8d-4fd23be68ce8,network=Network(3a911305-1c41-4e8b-b203-9200b81948ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapce7c20f0-f2') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 22:12:51 compute-0 nova_compute[192716]: 2025-10-07 22:12:51.826 2 DEBUG os_vif [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:81:bc:ba,bridge_name='br-int',has_traffic_filtering=True,id=ce7c20f0-f260-4b47-9e8d-4fd23be68ce8,network=Network(3a911305-1c41-4e8b-b203-9200b81948ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapce7c20f0-f2') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 07 22:12:51 compute-0 nova_compute[192716]: 2025-10-07 22:12:51.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:12:51 compute-0 nova_compute[192716]: 2025-10-07 22:12:51.827 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:12:51 compute-0 nova_compute[192716]: 2025-10-07 22:12:51.828 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:12:51 compute-0 nova_compute[192716]: 2025-10-07 22:12:51.828 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:12:51 compute-0 nova_compute[192716]: 2025-10-07 22:12:51.828 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'e2689257-a24b-570d-851b-06411a2c9d0e', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:12:51 compute-0 nova_compute[192716]: 2025-10-07 22:12:51.830 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:12:51 compute-0 nova_compute[192716]: 2025-10-07 22:12:51.831 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:12:51 compute-0 nova_compute[192716]: 2025-10-07 22:12:51.835 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:12:51 compute-0 nova_compute[192716]: 2025-10-07 22:12:51.836 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapce7c20f0-f2, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:12:51 compute-0 nova_compute[192716]: 2025-10-07 22:12:51.836 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapce7c20f0-f2, col_values=(('qos', UUID('d37697be-2867-468a-aae2-33d3ec7d157e')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:12:51 compute-0 nova_compute[192716]: 2025-10-07 22:12:51.836 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapce7c20f0-f2, col_values=(('external_ids', {'iface-id': 'ce7c20f0-f260-4b47-9e8d-4fd23be68ce8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:81:bc:ba', 'vm-uuid': 'cc1ed00e-556f-40b9-8d60-91367908a213'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:12:51 compute-0 nova_compute[192716]: 2025-10-07 22:12:51.838 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:12:51 compute-0 NetworkManager[51722]: <info>  [1759875171.8394] manager: (tapce7c20f0-f2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/73)
Oct 07 22:12:51 compute-0 nova_compute[192716]: 2025-10-07 22:12:51.840 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 07 22:12:51 compute-0 nova_compute[192716]: 2025-10-07 22:12:51.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:12:51 compute-0 nova_compute[192716]: 2025-10-07 22:12:51.850 2 INFO os_vif [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:81:bc:ba,bridge_name='br-int',has_traffic_filtering=True,id=ce7c20f0-f260-4b47-9e8d-4fd23be68ce8,network=Network(3a911305-1c41-4e8b-b203-9200b81948ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapce7c20f0-f2')
Oct 07 22:12:51 compute-0 nova_compute[192716]: 2025-10-07 22:12:51.851 2 DEBUG nova.virt.libvirt.driver [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Oct 07 22:12:51 compute-0 nova_compute[192716]: 2025-10-07 22:12:51.851 2 DEBUG nova.compute.manager [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4_bl0ime',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='cc1ed00e-556f-40b9-8d60-91367908a213',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Oct 07 22:12:51 compute-0 nova_compute[192716]: 2025-10-07 22:12:51.852 2 WARNING neutronclient.v2_0.client [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:12:52 compute-0 nova_compute[192716]: 2025-10-07 22:12:52.292 2 WARNING neutronclient.v2_0.client [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:12:52 compute-0 nova_compute[192716]: 2025-10-07 22:12:52.791 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:12:52 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:12:52.791 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:e1:cc', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '62:84:ba:a4:1b:31'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:12:52 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:12:52.792 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 07 22:12:53 compute-0 nova_compute[192716]: 2025-10-07 22:12:53.332 2 DEBUG nova.network.neutron [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: cc1ed00e-556f-40b9-8d60-91367908a213] Port ce7c20f0-f260-4b47-9e8d-4fd23be68ce8 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Oct 07 22:12:53 compute-0 nova_compute[192716]: 2025-10-07 22:12:53.346 2 DEBUG nova.compute.manager [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4_bl0ime',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='cc1ed00e-556f-40b9-8d60-91367908a213',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Oct 07 22:12:55 compute-0 nova_compute[192716]: 2025-10-07 22:12:55.392 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:12:55 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:12:55.794 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=dca786dc-b408-4181-8e47-0e14c60f13da, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:12:55 compute-0 podman[224457]: 2025-10-07 22:12:55.864445979 +0000 UTC m=+0.109446040 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Oct 07 22:12:56 compute-0 systemd[1]: Starting libvirt proxy daemon...
Oct 07 22:12:56 compute-0 systemd[1]: Started libvirt proxy daemon.
Oct 07 22:12:56 compute-0 kernel: tapce7c20f0-f2: entered promiscuous mode
Oct 07 22:12:56 compute-0 nova_compute[192716]: 2025-10-07 22:12:56.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:12:56 compute-0 NetworkManager[51722]: <info>  [1759875176.5125] manager: (tapce7c20f0-f2): new Tun device (/org/freedesktop/NetworkManager/Devices/74)
Oct 07 22:12:56 compute-0 ovn_controller[94904]: 2025-10-07T22:12:56Z|00201|binding|INFO|Claiming lport ce7c20f0-f260-4b47-9e8d-4fd23be68ce8 for this additional chassis.
Oct 07 22:12:56 compute-0 ovn_controller[94904]: 2025-10-07T22:12:56Z|00202|binding|INFO|ce7c20f0-f260-4b47-9e8d-4fd23be68ce8: Claiming fa:16:3e:81:bc:ba 10.100.0.9
Oct 07 22:12:56 compute-0 nova_compute[192716]: 2025-10-07 22:12:56.524 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:12:56.531 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:81:bc:ba 10.100.0.9'], port_security=['fa:16:3e:81:bc:ba 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'cc1ed00e-556f-40b9-8d60-91367908a213', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3a911305-1c41-4e8b-b203-9200b81948ee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd5a4169ef6e443b4a1f43aa9ac237c66', 'neutron:revision_number': '10', 'neutron:security_group_ids': '294a90bf-037a-4fa5-8621-501690236cf0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fef08549-83ea-44ba-a65f-44ca77f5b5f1, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=ce7c20f0-f260-4b47-9e8d-4fd23be68ce8) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:12:56.531 103791 INFO neutron.agent.ovn.metadata.agent [-] Port ce7c20f0-f260-4b47-9e8d-4fd23be68ce8 in datapath 3a911305-1c41-4e8b-b203-9200b81948ee unbound from our chassis
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:12:56.532 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3a911305-1c41-4e8b-b203-9200b81948ee
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:12:56.545 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[3388be54-92ab-453f-9e8c-2473a693afd9]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:12:56.546 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3a911305-11 in ovnmeta-3a911305-1c41-4e8b-b203-9200b81948ee namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Oct 07 22:12:56 compute-0 systemd-udevd[224513]: Network interface NamePolicy= disabled on kernel command line.
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:12:56.549 214116 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3a911305-10 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:12:56.549 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[c8a4f7d7-f198-41c1-8816-ca5fc7137c94]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:12:56.552 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[4d6bd321-3d9a-432d-9174-b44ea6c8fe32]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:12:56 compute-0 NetworkManager[51722]: <info>  [1759875176.5662] device (tapce7c20f0-f2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 07 22:12:56 compute-0 NetworkManager[51722]: <info>  [1759875176.5673] device (tapce7c20f0-f2): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:12:56.569 103905 DEBUG oslo.privsep.daemon [-] privsep: reply[58ecf2eb-2cad-49e9-abdb-c0902a586416]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:12:56 compute-0 systemd-machined[152719]: New machine qemu-17-instance-00000016.
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:12:56.607 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[8f6461db-bb00-4468-b442-d7153fe24b89]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:12:56 compute-0 systemd[1]: Started Virtual Machine qemu-17-instance-00000016.
Oct 07 22:12:56 compute-0 nova_compute[192716]: 2025-10-07 22:12:56.615 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:12:56 compute-0 ovn_controller[94904]: 2025-10-07T22:12:56Z|00203|binding|INFO|Setting lport ce7c20f0-f260-4b47-9e8d-4fd23be68ce8 ovn-installed in OVS
Oct 07 22:12:56 compute-0 nova_compute[192716]: 2025-10-07 22:12:56.621 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:12:56.648 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[1caf4c3b-94b9-416f-8715-b974fd0fd92d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:12:56 compute-0 NetworkManager[51722]: <info>  [1759875176.6549] manager: (tap3a911305-10): new Veth device (/org/freedesktop/NetworkManager/Devices/75)
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:12:56.653 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[29ee6534-ad84-4a7c-8e39-accba8830620]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:12:56.692 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[0de58d3a-acd5-4990-bd0c-31cdbf3855bb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:12:56.696 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[db4dde48-b24e-4ccf-937a-1b4cd0a83580]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:12:56 compute-0 NetworkManager[51722]: <info>  [1759875176.7175] device (tap3a911305-10): carrier: link connected
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:12:56.721 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[7011d556-9bd3-40ec-a4d2-aa67b6b99443]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:12:56.739 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[28648779-1be6-493c-aa14-a3a5fabb87d6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3a911305-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:47:d8:54'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 54], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 491029, 'reachable_time': 42016, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224548, 'error': None, 'target': 'ovnmeta-3a911305-1c41-4e8b-b203-9200b81948ee', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:12:56.754 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[d3108753-41ef-4d80-8b98-78cbb1d98dba]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe47:d854'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 491029, 'tstamp': 491029}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224549, 'error': None, 'target': 'ovnmeta-3a911305-1c41-4e8b-b203-9200b81948ee', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:12:56.775 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[566b6c5d-710b-4c9a-9948-b2baa220d9dd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3a911305-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:47:d8:54'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 54], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 491029, 'reachable_time': 42016, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224550, 'error': None, 'target': 'ovnmeta-3a911305-1c41-4e8b-b203-9200b81948ee', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:12:56.809 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[38a569b8-6611-4dea-88c7-f7b24842a846]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:12:56 compute-0 nova_compute[192716]: 2025-10-07 22:12:56.838 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:12:56.886 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[b8becf17-3fa2-4712-97be-9114aeb06833]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:12:56.888 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3a911305-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:12:56.888 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:12:56.889 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3a911305-10, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:12:56 compute-0 nova_compute[192716]: 2025-10-07 22:12:56.891 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:12:56 compute-0 NetworkManager[51722]: <info>  [1759875176.8924] manager: (tap3a911305-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/76)
Oct 07 22:12:56 compute-0 kernel: tap3a911305-10: entered promiscuous mode
Oct 07 22:12:56 compute-0 nova_compute[192716]: 2025-10-07 22:12:56.894 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:12:56.895 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3a911305-10, col_values=(('external_ids', {'iface-id': 'b4222228-ac79-43ed-9842-3bc20abbb5e0'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:12:56 compute-0 nova_compute[192716]: 2025-10-07 22:12:56.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:12:56 compute-0 ovn_controller[94904]: 2025-10-07T22:12:56Z|00204|binding|INFO|Releasing lport b4222228-ac79-43ed-9842-3bc20abbb5e0 from this chassis (sb_readonly=0)
Oct 07 22:12:56 compute-0 nova_compute[192716]: 2025-10-07 22:12:56.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:12:56.899 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[0e17d5bc-f7d3-4063-9c9f-3674a18c8020]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:12:56.900 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3a911305-1c41-4e8b-b203-9200b81948ee.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3a911305-1c41-4e8b-b203-9200b81948ee.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:12:56.900 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3a911305-1c41-4e8b-b203-9200b81948ee.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3a911305-1c41-4e8b-b203-9200b81948ee.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:12:56.900 103791 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 3a911305-1c41-4e8b-b203-9200b81948ee disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:12:56.900 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3a911305-1c41-4e8b-b203-9200b81948ee.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3a911305-1c41-4e8b-b203-9200b81948ee.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:12:56.901 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[565c5a83-cf8e-476e-95c8-1b3166d1037e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:12:56.901 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3a911305-1c41-4e8b-b203-9200b81948ee.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3a911305-1c41-4e8b-b203-9200b81948ee.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:12:56.902 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[b100da03-94e7-47c5-9fec-5e3030508b84]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:12:56.902 103791 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]: global
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]:     log         /dev/log local0 debug
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]:     log-tag     haproxy-metadata-proxy-3a911305-1c41-4e8b-b203-9200b81948ee
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]:     user        root
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]:     group       root
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]:     maxconn     1024
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]:     pidfile     /var/lib/neutron/external/pids/3a911305-1c41-4e8b-b203-9200b81948ee.pid.haproxy
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]:     daemon
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]: 
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]: defaults
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]:     log global
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]:     mode http
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]:     option httplog
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]:     option dontlognull
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]:     option http-server-close
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]:     option forwardfor
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]:     retries                 3
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]:     timeout http-request    30s
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]:     timeout connect         30s
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]:     timeout client          32s
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]:     timeout server          32s
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]:     timeout http-keep-alive 30s
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]: 
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]: listen listener
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]:     bind 169.254.169.254:80
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]:     
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]:     server metadata /var/lib/neutron/metadata_proxy
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]: 
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]:     http-request add-header X-OVN-Network-ID 3a911305-1c41-4e8b-b203-9200b81948ee
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Oct 07 22:12:56 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:12:56.903 103791 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3a911305-1c41-4e8b-b203-9200b81948ee', 'env', 'PROCESS_TAG=haproxy-3a911305-1c41-4e8b-b203-9200b81948ee', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3a911305-1c41-4e8b-b203-9200b81948ee.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Oct 07 22:12:56 compute-0 nova_compute[192716]: 2025-10-07 22:12:56.909 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:12:57 compute-0 podman[224589]: 2025-10-07 22:12:57.350574369 +0000 UTC m=+0.067169270 container create 6b93ce8b12e1b28da7f5ffdf2979a52786f03c1fa7791c6bf1cbcccffa55b512 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-3a911305-1c41-4e8b-b203-9200b81948ee, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251007, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Oct 07 22:12:57 compute-0 systemd[1]: Started libpod-conmon-6b93ce8b12e1b28da7f5ffdf2979a52786f03c1fa7791c6bf1cbcccffa55b512.scope.
Oct 07 22:12:57 compute-0 podman[224589]: 2025-10-07 22:12:57.322025975 +0000 UTC m=+0.038620906 image pull 24d4277b41bbd1d97b6f360ea068040fe96182680512bacad34d1f578f4798a9 38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Oct 07 22:12:57 compute-0 systemd[1]: Started libcrun container.
Oct 07 22:12:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/16a9d6ed1103ad22af696fc2c0fe93d2a994eee05b668fbd6e79c2428452ec54/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 07 22:12:57 compute-0 podman[224589]: 2025-10-07 22:12:57.43549227 +0000 UTC m=+0.152087181 container init 6b93ce8b12e1b28da7f5ffdf2979a52786f03c1fa7791c6bf1cbcccffa55b512 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-3a911305-1c41-4e8b-b203-9200b81948ee, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251007, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Oct 07 22:12:57 compute-0 podman[224589]: 2025-10-07 22:12:57.443829571 +0000 UTC m=+0.160424472 container start 6b93ce8b12e1b28da7f5ffdf2979a52786f03c1fa7791c6bf1cbcccffa55b512 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-3a911305-1c41-4e8b-b203-9200b81948ee, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 07 22:12:57 compute-0 podman[224602]: 2025-10-07 22:12:57.447208809 +0000 UTC m=+0.065918074 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 07 22:12:57 compute-0 neutron-haproxy-ovnmeta-3a911305-1c41-4e8b-b203-9200b81948ee[224610]: [NOTICE]   (224628) : New worker (224630) forked
Oct 07 22:12:57 compute-0 neutron-haproxy-ovnmeta-3a911305-1c41-4e8b-b203-9200b81948ee[224610]: [NOTICE]   (224628) : Loading success.
Oct 07 22:12:59 compute-0 podman[203153]: time="2025-10-07T22:12:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:12:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:12:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20751 "" "Go-http-client/1.1"
Oct 07 22:12:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:12:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3481 "" "Go-http-client/1.1"
Oct 07 22:13:00 compute-0 ovn_controller[94904]: 2025-10-07T22:13:00Z|00205|binding|INFO|Claiming lport ce7c20f0-f260-4b47-9e8d-4fd23be68ce8 for this chassis.
Oct 07 22:13:00 compute-0 ovn_controller[94904]: 2025-10-07T22:13:00Z|00206|binding|INFO|ce7c20f0-f260-4b47-9e8d-4fd23be68ce8: Claiming fa:16:3e:81:bc:ba 10.100.0.9
Oct 07 22:13:00 compute-0 ovn_controller[94904]: 2025-10-07T22:13:00Z|00207|binding|INFO|Setting lport ce7c20f0-f260-4b47-9e8d-4fd23be68ce8 up in Southbound
Oct 07 22:13:00 compute-0 nova_compute[192716]: 2025-10-07 22:13:00.394 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:13:01 compute-0 openstack_network_exporter[205305]: ERROR   22:13:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:13:01 compute-0 openstack_network_exporter[205305]: ERROR   22:13:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:13:01 compute-0 openstack_network_exporter[205305]: ERROR   22:13:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:13:01 compute-0 openstack_network_exporter[205305]: ERROR   22:13:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:13:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:13:01 compute-0 openstack_network_exporter[205305]: ERROR   22:13:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:13:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:13:01 compute-0 nova_compute[192716]: 2025-10-07 22:13:01.484 2 INFO nova.compute.manager [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: cc1ed00e-556f-40b9-8d60-91367908a213] Post operation of migration started
Oct 07 22:13:01 compute-0 nova_compute[192716]: 2025-10-07 22:13:01.485 2 WARNING neutronclient.v2_0.client [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:13:01 compute-0 nova_compute[192716]: 2025-10-07 22:13:01.595 2 WARNING neutronclient.v2_0.client [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:13:01 compute-0 nova_compute[192716]: 2025-10-07 22:13:01.596 2 WARNING neutronclient.v2_0.client [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:13:01 compute-0 nova_compute[192716]: 2025-10-07 22:13:01.657 2 DEBUG oslo_concurrency.lockutils [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "refresh_cache-cc1ed00e-556f-40b9-8d60-91367908a213" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 07 22:13:01 compute-0 nova_compute[192716]: 2025-10-07 22:13:01.658 2 DEBUG oslo_concurrency.lockutils [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquired lock "refresh_cache-cc1ed00e-556f-40b9-8d60-91367908a213" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 07 22:13:01 compute-0 nova_compute[192716]: 2025-10-07 22:13:01.658 2 DEBUG nova.network.neutron [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: cc1ed00e-556f-40b9-8d60-91367908a213] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 07 22:13:01 compute-0 nova_compute[192716]: 2025-10-07 22:13:01.839 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:13:01 compute-0 podman[224653]: 2025-10-07 22:13:01.839724752 +0000 UTC m=+0.078270200 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, container_name=openstack_network_exporter, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, version=9.6, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, release=1755695350, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm)
Oct 07 22:13:02 compute-0 nova_compute[192716]: 2025-10-07 22:13:02.165 2 WARNING neutronclient.v2_0.client [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:13:02 compute-0 nova_compute[192716]: 2025-10-07 22:13:02.647 2 WARNING neutronclient.v2_0.client [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:13:02 compute-0 nova_compute[192716]: 2025-10-07 22:13:02.852 2 DEBUG nova.network.neutron [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: cc1ed00e-556f-40b9-8d60-91367908a213] Updating instance_info_cache with network_info: [{"id": "ce7c20f0-f260-4b47-9e8d-4fd23be68ce8", "address": "fa:16:3e:81:bc:ba", "network": {"id": "3a911305-1c41-4e8b-b203-9200b81948ee", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-245730221-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3de874d926748bd99c4598b8d738295", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce7c20f0-f2", "ovs_interfaceid": "ce7c20f0-f260-4b47-9e8d-4fd23be68ce8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:13:03 compute-0 nova_compute[192716]: 2025-10-07 22:13:03.362 2 DEBUG oslo_concurrency.lockutils [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Releasing lock "refresh_cache-cc1ed00e-556f-40b9-8d60-91367908a213" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 07 22:13:03 compute-0 nova_compute[192716]: 2025-10-07 22:13:03.886 2 DEBUG oslo_concurrency.lockutils [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:13:03 compute-0 nova_compute[192716]: 2025-10-07 22:13:03.887 2 DEBUG oslo_concurrency.lockutils [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:13:03 compute-0 nova_compute[192716]: 2025-10-07 22:13:03.887 2 DEBUG oslo_concurrency.lockutils [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:13:03 compute-0 nova_compute[192716]: 2025-10-07 22:13:03.893 2 INFO nova.virt.libvirt.driver [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: cc1ed00e-556f-40b9-8d60-91367908a213] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Oct 07 22:13:03 compute-0 virtqemud[192532]: Domain id=17 name='instance-00000016' uuid=cc1ed00e-556f-40b9-8d60-91367908a213 is tainted: custom-monitor
Oct 07 22:13:04 compute-0 nova_compute[192716]: 2025-10-07 22:13:04.901 2 INFO nova.virt.libvirt.driver [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: cc1ed00e-556f-40b9-8d60-91367908a213] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Oct 07 22:13:05 compute-0 nova_compute[192716]: 2025-10-07 22:13:05.396 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:13:05 compute-0 nova_compute[192716]: 2025-10-07 22:13:05.909 2 INFO nova.virt.libvirt.driver [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: cc1ed00e-556f-40b9-8d60-91367908a213] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Oct 07 22:13:05 compute-0 nova_compute[192716]: 2025-10-07 22:13:05.914 2 DEBUG nova.compute.manager [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: cc1ed00e-556f-40b9-8d60-91367908a213] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 07 22:13:06 compute-0 nova_compute[192716]: 2025-10-07 22:13:06.426 2 DEBUG nova.objects.instance [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: cc1ed00e-556f-40b9-8d60-91367908a213] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Oct 07 22:13:06 compute-0 nova_compute[192716]: 2025-10-07 22:13:06.842 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:13:07 compute-0 nova_compute[192716]: 2025-10-07 22:13:07.447 2 WARNING neutronclient.v2_0.client [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:13:07 compute-0 nova_compute[192716]: 2025-10-07 22:13:07.519 2 WARNING neutronclient.v2_0.client [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:13:07 compute-0 nova_compute[192716]: 2025-10-07 22:13:07.520 2 WARNING neutronclient.v2_0.client [None req-c06b5389-dc06-470d-827c-1a2998768f6e 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:13:10 compute-0 nova_compute[192716]: 2025-10-07 22:13:10.399 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:13:11 compute-0 nova_compute[192716]: 2025-10-07 22:13:11.845 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:13:15 compute-0 nova_compute[192716]: 2025-10-07 22:13:15.401 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:13:15 compute-0 podman[224675]: 2025-10-07 22:13:15.830745373 +0000 UTC m=+0.061416904 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.4)
Oct 07 22:13:15 compute-0 podman[224676]: 2025-10-07 22:13:15.836281492 +0000 UTC m=+0.067756637 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true)
Oct 07 22:13:16 compute-0 nova_compute[192716]: 2025-10-07 22:13:16.848 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:13:18 compute-0 podman[224715]: 2025-10-07 22:13:18.848952071 +0000 UTC m=+0.080614288 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 07 22:13:19 compute-0 nova_compute[192716]: 2025-10-07 22:13:19.035 2 DEBUG oslo_concurrency.lockutils [None req-ea12ebca-a410-4815-afe8-37b344ea12b1 ce5b7880b5ed459d9196d63a71180641 d5a4169ef6e443b4a1f43aa9ac237c66 - - default default] Acquiring lock "cc1ed00e-556f-40b9-8d60-91367908a213" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:13:19 compute-0 nova_compute[192716]: 2025-10-07 22:13:19.035 2 DEBUG oslo_concurrency.lockutils [None req-ea12ebca-a410-4815-afe8-37b344ea12b1 ce5b7880b5ed459d9196d63a71180641 d5a4169ef6e443b4a1f43aa9ac237c66 - - default default] Lock "cc1ed00e-556f-40b9-8d60-91367908a213" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:13:19 compute-0 nova_compute[192716]: 2025-10-07 22:13:19.035 2 DEBUG oslo_concurrency.lockutils [None req-ea12ebca-a410-4815-afe8-37b344ea12b1 ce5b7880b5ed459d9196d63a71180641 d5a4169ef6e443b4a1f43aa9ac237c66 - - default default] Acquiring lock "cc1ed00e-556f-40b9-8d60-91367908a213-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:13:19 compute-0 nova_compute[192716]: 2025-10-07 22:13:19.035 2 DEBUG oslo_concurrency.lockutils [None req-ea12ebca-a410-4815-afe8-37b344ea12b1 ce5b7880b5ed459d9196d63a71180641 d5a4169ef6e443b4a1f43aa9ac237c66 - - default default] Lock "cc1ed00e-556f-40b9-8d60-91367908a213-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:13:19 compute-0 nova_compute[192716]: 2025-10-07 22:13:19.036 2 DEBUG oslo_concurrency.lockutils [None req-ea12ebca-a410-4815-afe8-37b344ea12b1 ce5b7880b5ed459d9196d63a71180641 d5a4169ef6e443b4a1f43aa9ac237c66 - - default default] Lock "cc1ed00e-556f-40b9-8d60-91367908a213-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:13:19 compute-0 nova_compute[192716]: 2025-10-07 22:13:19.050 2 INFO nova.compute.manager [None req-ea12ebca-a410-4815-afe8-37b344ea12b1 ce5b7880b5ed459d9196d63a71180641 d5a4169ef6e443b4a1f43aa9ac237c66 - - default default] [instance: cc1ed00e-556f-40b9-8d60-91367908a213] Terminating instance
Oct 07 22:13:19 compute-0 nova_compute[192716]: 2025-10-07 22:13:19.571 2 DEBUG nova.compute.manager [None req-ea12ebca-a410-4815-afe8-37b344ea12b1 ce5b7880b5ed459d9196d63a71180641 d5a4169ef6e443b4a1f43aa9ac237c66 - - default default] [instance: cc1ed00e-556f-40b9-8d60-91367908a213] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 07 22:13:19 compute-0 kernel: tapce7c20f0-f2 (unregistering): left promiscuous mode
Oct 07 22:13:19 compute-0 NetworkManager[51722]: <info>  [1759875199.5951] device (tapce7c20f0-f2): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 07 22:13:19 compute-0 nova_compute[192716]: 2025-10-07 22:13:19.609 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:13:19 compute-0 ovn_controller[94904]: 2025-10-07T22:13:19Z|00208|binding|INFO|Releasing lport ce7c20f0-f260-4b47-9e8d-4fd23be68ce8 from this chassis (sb_readonly=0)
Oct 07 22:13:19 compute-0 ovn_controller[94904]: 2025-10-07T22:13:19Z|00209|binding|INFO|Setting lport ce7c20f0-f260-4b47-9e8d-4fd23be68ce8 down in Southbound
Oct 07 22:13:19 compute-0 ovn_controller[94904]: 2025-10-07T22:13:19Z|00210|binding|INFO|Removing iface tapce7c20f0-f2 ovn-installed in OVS
Oct 07 22:13:19 compute-0 nova_compute[192716]: 2025-10-07 22:13:19.613 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:13:19 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:13:19.618 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:81:bc:ba 10.100.0.9'], port_security=['fa:16:3e:81:bc:ba 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'cc1ed00e-556f-40b9-8d60-91367908a213', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3a911305-1c41-4e8b-b203-9200b81948ee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd5a4169ef6e443b4a1f43aa9ac237c66', 'neutron:revision_number': '15', 'neutron:security_group_ids': '294a90bf-037a-4fa5-8621-501690236cf0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fef08549-83ea-44ba-a65f-44ca77f5b5f1, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], logical_port=ce7c20f0-f260-4b47-9e8d-4fd23be68ce8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f071072e510>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:13:19 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:13:19.619 103791 INFO neutron.agent.ovn.metadata.agent [-] Port ce7c20f0-f260-4b47-9e8d-4fd23be68ce8 in datapath 3a911305-1c41-4e8b-b203-9200b81948ee unbound from our chassis
Oct 07 22:13:19 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:13:19.620 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3a911305-1c41-4e8b-b203-9200b81948ee, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 07 22:13:19 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:13:19.622 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[d12f7474-8b8d-4f99-992e-a24a38df183d]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:13:19 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:13:19.623 103791 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3a911305-1c41-4e8b-b203-9200b81948ee namespace which is not needed anymore
Oct 07 22:13:19 compute-0 nova_compute[192716]: 2025-10-07 22:13:19.640 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:13:19 compute-0 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000016.scope: Deactivated successfully.
Oct 07 22:13:19 compute-0 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000016.scope: Consumed 2.889s CPU time.
Oct 07 22:13:19 compute-0 systemd-machined[152719]: Machine qemu-17-instance-00000016 terminated.
Oct 07 22:13:19 compute-0 nova_compute[192716]: 2025-10-07 22:13:19.750 2 DEBUG nova.compute.manager [req-1eb33824-2c76-437b-84d4-cf483c1c68c8 req-4a0d6d7d-02e5-4971-ada7-c68909b8561c 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: cc1ed00e-556f-40b9-8d60-91367908a213] Received event network-vif-unplugged-ce7c20f0-f260-4b47-9e8d-4fd23be68ce8 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:13:19 compute-0 nova_compute[192716]: 2025-10-07 22:13:19.751 2 DEBUG oslo_concurrency.lockutils [req-1eb33824-2c76-437b-84d4-cf483c1c68c8 req-4a0d6d7d-02e5-4971-ada7-c68909b8561c 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "cc1ed00e-556f-40b9-8d60-91367908a213-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:13:19 compute-0 nova_compute[192716]: 2025-10-07 22:13:19.751 2 DEBUG oslo_concurrency.lockutils [req-1eb33824-2c76-437b-84d4-cf483c1c68c8 req-4a0d6d7d-02e5-4971-ada7-c68909b8561c 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "cc1ed00e-556f-40b9-8d60-91367908a213-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:13:19 compute-0 nova_compute[192716]: 2025-10-07 22:13:19.751 2 DEBUG oslo_concurrency.lockutils [req-1eb33824-2c76-437b-84d4-cf483c1c68c8 req-4a0d6d7d-02e5-4971-ada7-c68909b8561c 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "cc1ed00e-556f-40b9-8d60-91367908a213-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:13:19 compute-0 nova_compute[192716]: 2025-10-07 22:13:19.752 2 DEBUG nova.compute.manager [req-1eb33824-2c76-437b-84d4-cf483c1c68c8 req-4a0d6d7d-02e5-4971-ada7-c68909b8561c 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: cc1ed00e-556f-40b9-8d60-91367908a213] No waiting events found dispatching network-vif-unplugged-ce7c20f0-f260-4b47-9e8d-4fd23be68ce8 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 22:13:19 compute-0 nova_compute[192716]: 2025-10-07 22:13:19.752 2 DEBUG nova.compute.manager [req-1eb33824-2c76-437b-84d4-cf483c1c68c8 req-4a0d6d7d-02e5-4971-ada7-c68909b8561c 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: cc1ed00e-556f-40b9-8d60-91367908a213] Received event network-vif-unplugged-ce7c20f0-f260-4b47-9e8d-4fd23be68ce8 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 07 22:13:19 compute-0 neutron-haproxy-ovnmeta-3a911305-1c41-4e8b-b203-9200b81948ee[224610]: [NOTICE]   (224628) : haproxy version is 3.0.5-8e879a5
Oct 07 22:13:19 compute-0 neutron-haproxy-ovnmeta-3a911305-1c41-4e8b-b203-9200b81948ee[224610]: [NOTICE]   (224628) : path to executable is /usr/sbin/haproxy
Oct 07 22:13:19 compute-0 neutron-haproxy-ovnmeta-3a911305-1c41-4e8b-b203-9200b81948ee[224610]: [WARNING]  (224628) : Exiting Master process...
Oct 07 22:13:19 compute-0 podman[224764]: 2025-10-07 22:13:19.769266979 +0000 UTC m=+0.030385608 container kill 6b93ce8b12e1b28da7f5ffdf2979a52786f03c1fa7791c6bf1cbcccffa55b512 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-3a911305-1c41-4e8b-b203-9200b81948ee, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0)
Oct 07 22:13:19 compute-0 neutron-haproxy-ovnmeta-3a911305-1c41-4e8b-b203-9200b81948ee[224610]: [ALERT]    (224628) : Current worker (224630) exited with code 143 (Terminated)
Oct 07 22:13:19 compute-0 neutron-haproxy-ovnmeta-3a911305-1c41-4e8b-b203-9200b81948ee[224610]: [WARNING]  (224628) : All workers exited. Exiting... (0)
Oct 07 22:13:19 compute-0 systemd[1]: libpod-6b93ce8b12e1b28da7f5ffdf2979a52786f03c1fa7791c6bf1cbcccffa55b512.scope: Deactivated successfully.
Oct 07 22:13:19 compute-0 nova_compute[192716]: 2025-10-07 22:13:19.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:13:19 compute-0 nova_compute[192716]: 2025-10-07 22:13:19.800 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:13:19 compute-0 podman[224780]: 2025-10-07 22:13:19.826982835 +0000 UTC m=+0.027749622 container died 6b93ce8b12e1b28da7f5ffdf2979a52786f03c1fa7791c6bf1cbcccffa55b512 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-3a911305-1c41-4e8b-b203-9200b81948ee, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct 07 22:13:19 compute-0 nova_compute[192716]: 2025-10-07 22:13:19.845 2 INFO nova.virt.libvirt.driver [-] [instance: cc1ed00e-556f-40b9-8d60-91367908a213] Instance destroyed successfully.
Oct 07 22:13:19 compute-0 nova_compute[192716]: 2025-10-07 22:13:19.845 2 DEBUG nova.objects.instance [None req-ea12ebca-a410-4815-afe8-37b344ea12b1 ce5b7880b5ed459d9196d63a71180641 d5a4169ef6e443b4a1f43aa9ac237c66 - - default default] Lazy-loading 'resources' on Instance uuid cc1ed00e-556f-40b9-8d60-91367908a213 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 22:13:19 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6b93ce8b12e1b28da7f5ffdf2979a52786f03c1fa7791c6bf1cbcccffa55b512-userdata-shm.mount: Deactivated successfully.
Oct 07 22:13:19 compute-0 systemd[1]: var-lib-containers-storage-overlay-16a9d6ed1103ad22af696fc2c0fe93d2a994eee05b668fbd6e79c2428452ec54-merged.mount: Deactivated successfully.
Oct 07 22:13:19 compute-0 podman[224780]: 2025-10-07 22:13:19.875750113 +0000 UTC m=+0.076516870 container remove 6b93ce8b12e1b28da7f5ffdf2979a52786f03c1fa7791c6bf1cbcccffa55b512 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-3a911305-1c41-4e8b-b203-9200b81948ee, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 07 22:13:19 compute-0 systemd[1]: libpod-conmon-6b93ce8b12e1b28da7f5ffdf2979a52786f03c1fa7791c6bf1cbcccffa55b512.scope: Deactivated successfully.
Oct 07 22:13:19 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:13:19.884 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[12588e0e-5a7e-41cb-a94d-273910ea0d6d]: (4, ("Tue Oct  7 10:13:19 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-3a911305-1c41-4e8b-b203-9200b81948ee (6b93ce8b12e1b28da7f5ffdf2979a52786f03c1fa7791c6bf1cbcccffa55b512)\n6b93ce8b12e1b28da7f5ffdf2979a52786f03c1fa7791c6bf1cbcccffa55b512\nTue Oct  7 10:13:19 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-3a911305-1c41-4e8b-b203-9200b81948ee (6b93ce8b12e1b28da7f5ffdf2979a52786f03c1fa7791c6bf1cbcccffa55b512)\n6b93ce8b12e1b28da7f5ffdf2979a52786f03c1fa7791c6bf1cbcccffa55b512\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:13:19 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:13:19.886 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[f3b830b1-a219-476a-8455-fc6d1b2b06af]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:13:19 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:13:19.887 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3a911305-1c41-4e8b-b203-9200b81948ee.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3a911305-1c41-4e8b-b203-9200b81948ee.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 22:13:19 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:13:19.888 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[9f38843b-20fb-498b-af5d-587518d431dc]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:13:19 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:13:19.888 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3a911305-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:13:19 compute-0 kernel: tap3a911305-10: left promiscuous mode
Oct 07 22:13:19 compute-0 nova_compute[192716]: 2025-10-07 22:13:19.890 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:13:19 compute-0 nova_compute[192716]: 2025-10-07 22:13:19.907 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:13:19 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:13:19.910 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[2695e3e9-fa50-4f44-b480-2dd92838b241]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:13:19 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:13:19.934 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[2775c060-3600-48c0-8353-519e7049ae68]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:13:19 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:13:19.935 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[3056b55d-958d-4678-a48a-8b4298c84612]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:13:19 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:13:19.950 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[a7391c75-a29c-4267-abc5-8ed0d66180a2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 491022, 'reachable_time': 23937, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224825, 'error': None, 'target': 'ovnmeta-3a911305-1c41-4e8b-b203-9200b81948ee', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:13:19 compute-0 systemd[1]: run-netns-ovnmeta\x2d3a911305\x2d1c41\x2d4e8b\x2db203\x2d9200b81948ee.mount: Deactivated successfully.
Oct 07 22:13:19 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:13:19.953 103905 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3a911305-1c41-4e8b-b203-9200b81948ee deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Oct 07 22:13:19 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:13:19.953 103905 DEBUG oslo.privsep.daemon [-] privsep: reply[3a6cded8-a238-43c6-ab9a-ebccd9c78621]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:13:20 compute-0 nova_compute[192716]: 2025-10-07 22:13:20.354 2 DEBUG nova.virt.libvirt.vif [None req-ea12ebca-a410-4815-afe8-37b344ea12b1 ce5b7880b5ed459d9196d63a71180641 d5a4169ef6e443b4a1f43aa9ac237c66 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-10-07T22:11:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-372835884',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-372835884',id=22,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-07T22:12:04Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d5a4169ef6e443b4a1f43aa9ac237c66',ramdisk_id='',reservation_id='r-whnkbozb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',clean_attempts='1',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model
='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-898676780',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-898676780-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T22:13:06Z,user_data=None,user_id='ce5b7880b5ed459d9196d63a71180641',uuid=cc1ed00e-556f-40b9-8d60-91367908a213,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ce7c20f0-f260-4b47-9e8d-4fd23be68ce8", "address": "fa:16:3e:81:bc:ba", "network": {"id": "3a911305-1c41-4e8b-b203-9200b81948ee", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-245730221-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3de874d926748bd99c4598b8d738295", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce7c20f0-f2", "ovs_interfaceid": "ce7c20f0-f260-4b47-9e8d-4fd23be68ce8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 07 22:13:20 compute-0 nova_compute[192716]: 2025-10-07 22:13:20.355 2 DEBUG nova.network.os_vif_util [None req-ea12ebca-a410-4815-afe8-37b344ea12b1 ce5b7880b5ed459d9196d63a71180641 d5a4169ef6e443b4a1f43aa9ac237c66 - - default default] Converting VIF {"id": "ce7c20f0-f260-4b47-9e8d-4fd23be68ce8", "address": "fa:16:3e:81:bc:ba", "network": {"id": "3a911305-1c41-4e8b-b203-9200b81948ee", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-245730221-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3de874d926748bd99c4598b8d738295", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce7c20f0-f2", "ovs_interfaceid": "ce7c20f0-f260-4b47-9e8d-4fd23be68ce8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 22:13:20 compute-0 nova_compute[192716]: 2025-10-07 22:13:20.356 2 DEBUG nova.network.os_vif_util [None req-ea12ebca-a410-4815-afe8-37b344ea12b1 ce5b7880b5ed459d9196d63a71180641 d5a4169ef6e443b4a1f43aa9ac237c66 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:81:bc:ba,bridge_name='br-int',has_traffic_filtering=True,id=ce7c20f0-f260-4b47-9e8d-4fd23be68ce8,network=Network(3a911305-1c41-4e8b-b203-9200b81948ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapce7c20f0-f2') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 22:13:20 compute-0 nova_compute[192716]: 2025-10-07 22:13:20.356 2 DEBUG os_vif [None req-ea12ebca-a410-4815-afe8-37b344ea12b1 ce5b7880b5ed459d9196d63a71180641 d5a4169ef6e443b4a1f43aa9ac237c66 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:81:bc:ba,bridge_name='br-int',has_traffic_filtering=True,id=ce7c20f0-f260-4b47-9e8d-4fd23be68ce8,network=Network(3a911305-1c41-4e8b-b203-9200b81948ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapce7c20f0-f2') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 07 22:13:20 compute-0 nova_compute[192716]: 2025-10-07 22:13:20.360 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:13:20 compute-0 nova_compute[192716]: 2025-10-07 22:13:20.360 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapce7c20f0-f2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:13:20 compute-0 nova_compute[192716]: 2025-10-07 22:13:20.362 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:13:20 compute-0 nova_compute[192716]: 2025-10-07 22:13:20.363 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:13:20 compute-0 nova_compute[192716]: 2025-10-07 22:13:20.364 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:13:20 compute-0 nova_compute[192716]: 2025-10-07 22:13:20.364 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=d37697be-2867-468a-aae2-33d3ec7d157e) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:13:20 compute-0 nova_compute[192716]: 2025-10-07 22:13:20.365 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:13:20 compute-0 nova_compute[192716]: 2025-10-07 22:13:20.366 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 07 22:13:20 compute-0 nova_compute[192716]: 2025-10-07 22:13:20.368 2 INFO os_vif [None req-ea12ebca-a410-4815-afe8-37b344ea12b1 ce5b7880b5ed459d9196d63a71180641 d5a4169ef6e443b4a1f43aa9ac237c66 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:81:bc:ba,bridge_name='br-int',has_traffic_filtering=True,id=ce7c20f0-f260-4b47-9e8d-4fd23be68ce8,network=Network(3a911305-1c41-4e8b-b203-9200b81948ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapce7c20f0-f2')
Oct 07 22:13:20 compute-0 nova_compute[192716]: 2025-10-07 22:13:20.369 2 INFO nova.virt.libvirt.driver [None req-ea12ebca-a410-4815-afe8-37b344ea12b1 ce5b7880b5ed459d9196d63a71180641 d5a4169ef6e443b4a1f43aa9ac237c66 - - default default] [instance: cc1ed00e-556f-40b9-8d60-91367908a213] Deleting instance files /var/lib/nova/instances/cc1ed00e-556f-40b9-8d60-91367908a213_del
Oct 07 22:13:20 compute-0 nova_compute[192716]: 2025-10-07 22:13:20.369 2 INFO nova.virt.libvirt.driver [None req-ea12ebca-a410-4815-afe8-37b344ea12b1 ce5b7880b5ed459d9196d63a71180641 d5a4169ef6e443b4a1f43aa9ac237c66 - - default default] [instance: cc1ed00e-556f-40b9-8d60-91367908a213] Deletion of /var/lib/nova/instances/cc1ed00e-556f-40b9-8d60-91367908a213_del complete
Oct 07 22:13:20 compute-0 nova_compute[192716]: 2025-10-07 22:13:20.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:13:20 compute-0 nova_compute[192716]: 2025-10-07 22:13:20.882 2 INFO nova.compute.manager [None req-ea12ebca-a410-4815-afe8-37b344ea12b1 ce5b7880b5ed459d9196d63a71180641 d5a4169ef6e443b4a1f43aa9ac237c66 - - default default] [instance: cc1ed00e-556f-40b9-8d60-91367908a213] Took 1.31 seconds to destroy the instance on the hypervisor.
Oct 07 22:13:20 compute-0 nova_compute[192716]: 2025-10-07 22:13:20.883 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-ea12ebca-a410-4815-afe8-37b344ea12b1 ce5b7880b5ed459d9196d63a71180641 d5a4169ef6e443b4a1f43aa9ac237c66 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 07 22:13:20 compute-0 nova_compute[192716]: 2025-10-07 22:13:20.883 2 DEBUG nova.compute.manager [-] [instance: cc1ed00e-556f-40b9-8d60-91367908a213] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 07 22:13:20 compute-0 nova_compute[192716]: 2025-10-07 22:13:20.883 2 DEBUG nova.network.neutron [-] [instance: cc1ed00e-556f-40b9-8d60-91367908a213] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 07 22:13:20 compute-0 nova_compute[192716]: 2025-10-07 22:13:20.883 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:13:21 compute-0 nova_compute[192716]: 2025-10-07 22:13:21.326 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:13:21 compute-0 nova_compute[192716]: 2025-10-07 22:13:21.695 2 DEBUG nova.compute.manager [req-779959f6-8ee8-4cc2-901e-0e3e7e27c68a req-c51829ce-46c6-43b3-8c26-f20ebf8edf99 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: cc1ed00e-556f-40b9-8d60-91367908a213] Received event network-vif-deleted-ce7c20f0-f260-4b47-9e8d-4fd23be68ce8 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:13:21 compute-0 nova_compute[192716]: 2025-10-07 22:13:21.696 2 INFO nova.compute.manager [req-779959f6-8ee8-4cc2-901e-0e3e7e27c68a req-c51829ce-46c6-43b3-8c26-f20ebf8edf99 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: cc1ed00e-556f-40b9-8d60-91367908a213] Neutron deleted interface ce7c20f0-f260-4b47-9e8d-4fd23be68ce8; detaching it from the instance and deleting it from the info cache
Oct 07 22:13:21 compute-0 nova_compute[192716]: 2025-10-07 22:13:21.696 2 DEBUG nova.network.neutron [req-779959f6-8ee8-4cc2-901e-0e3e7e27c68a req-c51829ce-46c6-43b3-8c26-f20ebf8edf99 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: cc1ed00e-556f-40b9-8d60-91367908a213] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:13:21 compute-0 nova_compute[192716]: 2025-10-07 22:13:21.812 2 DEBUG nova.compute.manager [req-073f073f-fe53-49bd-9d65-e9e249278dd8 req-0ccb0767-de6c-44ba-9bc1-d695087921da 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: cc1ed00e-556f-40b9-8d60-91367908a213] Received event network-vif-unplugged-ce7c20f0-f260-4b47-9e8d-4fd23be68ce8 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:13:21 compute-0 nova_compute[192716]: 2025-10-07 22:13:21.812 2 DEBUG oslo_concurrency.lockutils [req-073f073f-fe53-49bd-9d65-e9e249278dd8 req-0ccb0767-de6c-44ba-9bc1-d695087921da 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "cc1ed00e-556f-40b9-8d60-91367908a213-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:13:21 compute-0 nova_compute[192716]: 2025-10-07 22:13:21.813 2 DEBUG oslo_concurrency.lockutils [req-073f073f-fe53-49bd-9d65-e9e249278dd8 req-0ccb0767-de6c-44ba-9bc1-d695087921da 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "cc1ed00e-556f-40b9-8d60-91367908a213-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:13:21 compute-0 nova_compute[192716]: 2025-10-07 22:13:21.813 2 DEBUG oslo_concurrency.lockutils [req-073f073f-fe53-49bd-9d65-e9e249278dd8 req-0ccb0767-de6c-44ba-9bc1-d695087921da 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "cc1ed00e-556f-40b9-8d60-91367908a213-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:13:21 compute-0 nova_compute[192716]: 2025-10-07 22:13:21.814 2 DEBUG nova.compute.manager [req-073f073f-fe53-49bd-9d65-e9e249278dd8 req-0ccb0767-de6c-44ba-9bc1-d695087921da 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: cc1ed00e-556f-40b9-8d60-91367908a213] No waiting events found dispatching network-vif-unplugged-ce7c20f0-f260-4b47-9e8d-4fd23be68ce8 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 22:13:21 compute-0 nova_compute[192716]: 2025-10-07 22:13:21.814 2 DEBUG nova.compute.manager [req-073f073f-fe53-49bd-9d65-e9e249278dd8 req-0ccb0767-de6c-44ba-9bc1-d695087921da 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: cc1ed00e-556f-40b9-8d60-91367908a213] Received event network-vif-unplugged-ce7c20f0-f260-4b47-9e8d-4fd23be68ce8 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 07 22:13:22 compute-0 nova_compute[192716]: 2025-10-07 22:13:22.123 2 DEBUG nova.network.neutron [-] [instance: cc1ed00e-556f-40b9-8d60-91367908a213] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:13:22 compute-0 nova_compute[192716]: 2025-10-07 22:13:22.203 2 DEBUG nova.compute.manager [req-779959f6-8ee8-4cc2-901e-0e3e7e27c68a req-c51829ce-46c6-43b3-8c26-f20ebf8edf99 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: cc1ed00e-556f-40b9-8d60-91367908a213] Detach interface failed, port_id=ce7c20f0-f260-4b47-9e8d-4fd23be68ce8, reason: Instance cc1ed00e-556f-40b9-8d60-91367908a213 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 07 22:13:22 compute-0 nova_compute[192716]: 2025-10-07 22:13:22.631 2 INFO nova.compute.manager [-] [instance: cc1ed00e-556f-40b9-8d60-91367908a213] Took 1.75 seconds to deallocate network for instance.
Oct 07 22:13:23 compute-0 nova_compute[192716]: 2025-10-07 22:13:23.164 2 DEBUG oslo_concurrency.lockutils [None req-ea12ebca-a410-4815-afe8-37b344ea12b1 ce5b7880b5ed459d9196d63a71180641 d5a4169ef6e443b4a1f43aa9ac237c66 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:13:23 compute-0 nova_compute[192716]: 2025-10-07 22:13:23.165 2 DEBUG oslo_concurrency.lockutils [None req-ea12ebca-a410-4815-afe8-37b344ea12b1 ce5b7880b5ed459d9196d63a71180641 d5a4169ef6e443b4a1f43aa9ac237c66 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:13:23 compute-0 nova_compute[192716]: 2025-10-07 22:13:23.172 2 DEBUG oslo_concurrency.lockutils [None req-ea12ebca-a410-4815-afe8-37b344ea12b1 ce5b7880b5ed459d9196d63a71180641 d5a4169ef6e443b4a1f43aa9ac237c66 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.007s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:13:23 compute-0 nova_compute[192716]: 2025-10-07 22:13:23.254 2 INFO nova.scheduler.client.report [None req-ea12ebca-a410-4815-afe8-37b344ea12b1 ce5b7880b5ed459d9196d63a71180641 d5a4169ef6e443b4a1f43aa9ac237c66 - - default default] Deleted allocations for instance cc1ed00e-556f-40b9-8d60-91367908a213
Oct 07 22:13:24 compute-0 nova_compute[192716]: 2025-10-07 22:13:24.287 2 DEBUG oslo_concurrency.lockutils [None req-ea12ebca-a410-4815-afe8-37b344ea12b1 ce5b7880b5ed459d9196d63a71180641 d5a4169ef6e443b4a1f43aa9ac237c66 - - default default] Lock "cc1ed00e-556f-40b9-8d60-91367908a213" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.252s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:13:25 compute-0 nova_compute[192716]: 2025-10-07 22:13:25.366 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:13:25 compute-0 nova_compute[192716]: 2025-10-07 22:13:25.404 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:13:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:13:25.650 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:13:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:13:25.650 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:13:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:13:25.650 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:13:26 compute-0 podman[224827]: 2025-10-07 22:13:26.899561534 +0000 UTC m=+0.128728627 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251007)
Oct 07 22:13:27 compute-0 podman[224854]: 2025-10-07 22:13:27.825141494 +0000 UTC m=+0.065336277 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4)
Oct 07 22:13:29 compute-0 podman[203153]: time="2025-10-07T22:13:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:13:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:13:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 22:13:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:13:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3025 "" "Go-http-client/1.1"
Oct 07 22:13:30 compute-0 nova_compute[192716]: 2025-10-07 22:13:30.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:13:30 compute-0 nova_compute[192716]: 2025-10-07 22:13:30.406 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:13:31 compute-0 openstack_network_exporter[205305]: ERROR   22:13:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:13:31 compute-0 openstack_network_exporter[205305]: ERROR   22:13:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:13:31 compute-0 openstack_network_exporter[205305]: ERROR   22:13:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:13:31 compute-0 openstack_network_exporter[205305]: ERROR   22:13:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:13:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:13:31 compute-0 openstack_network_exporter[205305]: ERROR   22:13:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:13:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:13:32 compute-0 podman[224874]: 2025-10-07 22:13:32.82409569 +0000 UTC m=+0.063873795 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, distribution-scope=public, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Oct 07 22:13:34 compute-0 nova_compute[192716]: 2025-10-07 22:13:34.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:13:34 compute-0 nova_compute[192716]: 2025-10-07 22:13:34.990 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 07 22:13:35 compute-0 nova_compute[192716]: 2025-10-07 22:13:35.371 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:13:35 compute-0 nova_compute[192716]: 2025-10-07 22:13:35.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:13:35 compute-0 nova_compute[192716]: 2025-10-07 22:13:35.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:13:38 compute-0 nova_compute[192716]: 2025-10-07 22:13:38.986 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:13:40 compute-0 nova_compute[192716]: 2025-10-07 22:13:40.373 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:13:40 compute-0 nova_compute[192716]: 2025-10-07 22:13:40.410 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:13:43 compute-0 nova_compute[192716]: 2025-10-07 22:13:43.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:13:43 compute-0 nova_compute[192716]: 2025-10-07 22:13:43.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:13:44 compute-0 nova_compute[192716]: 2025-10-07 22:13:44.503 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:13:44 compute-0 nova_compute[192716]: 2025-10-07 22:13:44.504 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:13:44 compute-0 nova_compute[192716]: 2025-10-07 22:13:44.504 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:13:44 compute-0 nova_compute[192716]: 2025-10-07 22:13:44.505 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 07 22:13:44 compute-0 nova_compute[192716]: 2025-10-07 22:13:44.709 2 WARNING nova.virt.libvirt.driver [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 22:13:44 compute-0 nova_compute[192716]: 2025-10-07 22:13:44.711 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:13:44 compute-0 nova_compute[192716]: 2025-10-07 22:13:44.755 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.044s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:13:44 compute-0 nova_compute[192716]: 2025-10-07 22:13:44.755 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5851MB free_disk=73.30291366577148GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 07 22:13:44 compute-0 nova_compute[192716]: 2025-10-07 22:13:44.756 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:13:44 compute-0 nova_compute[192716]: 2025-10-07 22:13:44.756 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:13:45 compute-0 nova_compute[192716]: 2025-10-07 22:13:45.376 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:13:45 compute-0 nova_compute[192716]: 2025-10-07 22:13:45.414 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:13:45 compute-0 nova_compute[192716]: 2025-10-07 22:13:45.823 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 07 22:13:45 compute-0 nova_compute[192716]: 2025-10-07 22:13:45.823 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:13:44 up  1:22,  0 user,  load average: 0.13, 0.18, 0.24\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 07 22:13:45 compute-0 nova_compute[192716]: 2025-10-07 22:13:45.909 2 DEBUG nova.compute.provider_tree [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 22:13:46 compute-0 nova_compute[192716]: 2025-10-07 22:13:46.417 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 22:13:46 compute-0 podman[224899]: 2025-10-07 22:13:46.841746098 +0000 UTC m=+0.076373445 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251007)
Oct 07 22:13:46 compute-0 podman[224900]: 2025-10-07 22:13:46.874606607 +0000 UTC m=+0.104880379 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 07 22:13:46 compute-0 nova_compute[192716]: 2025-10-07 22:13:46.927 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 07 22:13:46 compute-0 nova_compute[192716]: 2025-10-07 22:13:46.927 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.171s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:13:49 compute-0 podman[224939]: 2025-10-07 22:13:49.835375297 +0000 UTC m=+0.073796671 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 07 22:13:49 compute-0 nova_compute[192716]: 2025-10-07 22:13:49.923 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:13:50 compute-0 nova_compute[192716]: 2025-10-07 22:13:50.378 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:13:50 compute-0 nova_compute[192716]: 2025-10-07 22:13:50.415 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:13:50 compute-0 nova_compute[192716]: 2025-10-07 22:13:50.434 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:13:50 compute-0 nova_compute[192716]: 2025-10-07 22:13:50.434 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:13:51 compute-0 nova_compute[192716]: 2025-10-07 22:13:51.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:13:55 compute-0 nova_compute[192716]: 2025-10-07 22:13:55.380 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:13:55 compute-0 nova_compute[192716]: 2025-10-07 22:13:55.417 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:13:57 compute-0 nova_compute[192716]: 2025-10-07 22:13:57.367 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:13:57 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:13:57.369 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:e1:cc', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '62:84:ba:a4:1b:31'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:13:57 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:13:57.370 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 07 22:13:57 compute-0 podman[224964]: 2025-10-07 22:13:57.86516965 +0000 UTC m=+0.100101220 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 07 22:13:57 compute-0 podman[224991]: 2025-10-07 22:13:57.951524073 +0000 UTC m=+0.051252941 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Oct 07 22:13:59 compute-0 podman[203153]: time="2025-10-07T22:13:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:13:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:13:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 22:13:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:13:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3025 "" "Go-http-client/1.1"
Oct 07 22:14:00 compute-0 nova_compute[192716]: 2025-10-07 22:14:00.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:14:00 compute-0 nova_compute[192716]: 2025-10-07 22:14:00.419 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:14:01 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:14:01.372 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=dca786dc-b408-4181-8e47-0e14c60f13da, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:14:01 compute-0 openstack_network_exporter[205305]: ERROR   22:14:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:14:01 compute-0 openstack_network_exporter[205305]: ERROR   22:14:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:14:01 compute-0 openstack_network_exporter[205305]: ERROR   22:14:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:14:01 compute-0 openstack_network_exporter[205305]: ERROR   22:14:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:14:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:14:01 compute-0 openstack_network_exporter[205305]: ERROR   22:14:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:14:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:14:03 compute-0 podman[225010]: 2025-10-07 22:14:03.819981513 +0000 UTC m=+0.057335326 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, architecture=x86_64, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, release=1755695350, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., distribution-scope=public, config_id=edpm, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct 07 22:14:05 compute-0 nova_compute[192716]: 2025-10-07 22:14:05.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:14:05 compute-0 nova_compute[192716]: 2025-10-07 22:14:05.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:14:10 compute-0 nova_compute[192716]: 2025-10-07 22:14:10.386 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:14:10 compute-0 nova_compute[192716]: 2025-10-07 22:14:10.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:14:15 compute-0 nova_compute[192716]: 2025-10-07 22:14:15.390 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:14:15 compute-0 nova_compute[192716]: 2025-10-07 22:14:15.426 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:14:17 compute-0 podman[225032]: 2025-10-07 22:14:17.830115065 +0000 UTC m=+0.066517162 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251007, tcib_build_tag=watcher_latest, container_name=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Oct 07 22:14:17 compute-0 podman[225033]: 2025-10-07 22:14:17.851365148 +0000 UTC m=+0.074684397 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, managed_by=edpm_ansible)
Oct 07 22:14:20 compute-0 nova_compute[192716]: 2025-10-07 22:14:20.392 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:14:20 compute-0 nova_compute[192716]: 2025-10-07 22:14:20.429 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:14:20 compute-0 podman[225073]: 2025-10-07 22:14:20.82743129 +0000 UTC m=+0.059943081 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 07 22:14:24 compute-0 nova_compute[192716]: 2025-10-07 22:14:24.350 2 DEBUG nova.virt.libvirt.driver [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 14ab7182-5946-4a3f-854e-62598e86e9ee] Creating tmpfile /var/lib/nova/instances/tmpsbma5bt5 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Oct 07 22:14:24 compute-0 nova_compute[192716]: 2025-10-07 22:14:24.351 2 WARNING neutronclient.v2_0.client [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:14:24 compute-0 nova_compute[192716]: 2025-10-07 22:14:24.357 2 DEBUG nova.compute.manager [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpsbma5bt5',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Oct 07 22:14:25 compute-0 nova_compute[192716]: 2025-10-07 22:14:25.394 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:14:25 compute-0 nova_compute[192716]: 2025-10-07 22:14:25.431 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:14:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:14:25.651 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:14:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:14:25.652 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:14:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:14:25.652 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:14:26 compute-0 nova_compute[192716]: 2025-10-07 22:14:26.416 2 WARNING neutronclient.v2_0.client [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:14:28 compute-0 podman[225099]: 2025-10-07 22:14:28.875957993 +0000 UTC m=+0.102722105 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Oct 07 22:14:28 compute-0 podman[225098]: 2025-10-07 22:14:28.876016945 +0000 UTC m=+0.109754039 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251007, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.license=GPLv2)
Oct 07 22:14:29 compute-0 podman[203153]: time="2025-10-07T22:14:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:14:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:14:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 22:14:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:14:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3025 "" "Go-http-client/1.1"
Oct 07 22:14:30 compute-0 nova_compute[192716]: 2025-10-07 22:14:30.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:14:30 compute-0 nova_compute[192716]: 2025-10-07 22:14:30.433 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:14:30 compute-0 nova_compute[192716]: 2025-10-07 22:14:30.872 2 DEBUG nova.compute.manager [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpsbma5bt5',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='14ab7182-5946-4a3f-854e-62598e86e9ee',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Oct 07 22:14:31 compute-0 openstack_network_exporter[205305]: ERROR   22:14:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:14:31 compute-0 openstack_network_exporter[205305]: ERROR   22:14:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:14:31 compute-0 openstack_network_exporter[205305]: ERROR   22:14:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:14:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:14:31 compute-0 openstack_network_exporter[205305]: ERROR   22:14:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:14:31 compute-0 openstack_network_exporter[205305]: ERROR   22:14:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:14:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:14:31 compute-0 nova_compute[192716]: 2025-10-07 22:14:31.889 2 DEBUG oslo_concurrency.lockutils [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "refresh_cache-14ab7182-5946-4a3f-854e-62598e86e9ee" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 07 22:14:31 compute-0 nova_compute[192716]: 2025-10-07 22:14:31.889 2 DEBUG oslo_concurrency.lockutils [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquired lock "refresh_cache-14ab7182-5946-4a3f-854e-62598e86e9ee" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 07 22:14:31 compute-0 nova_compute[192716]: 2025-10-07 22:14:31.889 2 DEBUG nova.network.neutron [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 14ab7182-5946-4a3f-854e-62598e86e9ee] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 07 22:14:32 compute-0 nova_compute[192716]: 2025-10-07 22:14:32.400 2 WARNING neutronclient.v2_0.client [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:14:34 compute-0 nova_compute[192716]: 2025-10-07 22:14:34.492 2 WARNING neutronclient.v2_0.client [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:14:34 compute-0 nova_compute[192716]: 2025-10-07 22:14:34.791 2 DEBUG nova.network.neutron [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 14ab7182-5946-4a3f-854e-62598e86e9ee] Updating instance_info_cache with network_info: [{"id": "a37754ca-59c3-4b33-94b2-fc7aec552c76", "address": "fa:16:3e:df:fb:43", "network": {"id": "3a911305-1c41-4e8b-b203-9200b81948ee", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-245730221-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3de874d926748bd99c4598b8d738295", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa37754ca-59", "ovs_interfaceid": "a37754ca-59c3-4b33-94b2-fc7aec552c76", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:14:34 compute-0 podman[225143]: 2025-10-07 22:14:34.82435825 +0000 UTC m=+0.061761464 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-type=git, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers)
Oct 07 22:14:34 compute-0 nova_compute[192716]: 2025-10-07 22:14:34.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:14:34 compute-0 nova_compute[192716]: 2025-10-07 22:14:34.990 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 07 22:14:35 compute-0 nova_compute[192716]: 2025-10-07 22:14:35.298 2 DEBUG oslo_concurrency.lockutils [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Releasing lock "refresh_cache-14ab7182-5946-4a3f-854e-62598e86e9ee" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 07 22:14:35 compute-0 nova_compute[192716]: 2025-10-07 22:14:35.317 2 DEBUG nova.virt.libvirt.driver [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 14ab7182-5946-4a3f-854e-62598e86e9ee] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpsbma5bt5',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='14ab7182-5946-4a3f-854e-62598e86e9ee',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Oct 07 22:14:35 compute-0 nova_compute[192716]: 2025-10-07 22:14:35.318 2 DEBUG nova.virt.libvirt.driver [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 14ab7182-5946-4a3f-854e-62598e86e9ee] Creating instance directory: /var/lib/nova/instances/14ab7182-5946-4a3f-854e-62598e86e9ee pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Oct 07 22:14:35 compute-0 nova_compute[192716]: 2025-10-07 22:14:35.319 2 DEBUG nova.virt.libvirt.driver [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 14ab7182-5946-4a3f-854e-62598e86e9ee] Creating disk.info with the contents: {'/var/lib/nova/instances/14ab7182-5946-4a3f-854e-62598e86e9ee/disk': 'qcow2', '/var/lib/nova/instances/14ab7182-5946-4a3f-854e-62598e86e9ee/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Oct 07 22:14:35 compute-0 nova_compute[192716]: 2025-10-07 22:14:35.320 2 DEBUG nova.virt.libvirt.driver [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 14ab7182-5946-4a3f-854e-62598e86e9ee] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Oct 07 22:14:35 compute-0 nova_compute[192716]: 2025-10-07 22:14:35.321 2 DEBUG nova.objects.instance [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 14ab7182-5946-4a3f-854e-62598e86e9ee obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 22:14:35 compute-0 nova_compute[192716]: 2025-10-07 22:14:35.399 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:14:35 compute-0 nova_compute[192716]: 2025-10-07 22:14:35.436 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:14:35 compute-0 nova_compute[192716]: 2025-10-07 22:14:35.829 2 DEBUG oslo_utils.imageutils.format_inspector [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:14:35 compute-0 nova_compute[192716]: 2025-10-07 22:14:35.834 2 DEBUG oslo_utils.imageutils.format_inspector [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:14:35 compute-0 nova_compute[192716]: 2025-10-07 22:14:35.838 2 DEBUG oslo_concurrency.processutils [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:14:35 compute-0 nova_compute[192716]: 2025-10-07 22:14:35.914 2 DEBUG oslo_concurrency.processutils [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:14:35 compute-0 nova_compute[192716]: 2025-10-07 22:14:35.915 2 DEBUG oslo_concurrency.lockutils [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:14:35 compute-0 nova_compute[192716]: 2025-10-07 22:14:35.915 2 DEBUG oslo_concurrency.lockutils [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:14:35 compute-0 nova_compute[192716]: 2025-10-07 22:14:35.916 2 DEBUG oslo_utils.imageutils.format_inspector [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:14:35 compute-0 nova_compute[192716]: 2025-10-07 22:14:35.920 2 DEBUG oslo_utils.imageutils.format_inspector [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:14:35 compute-0 nova_compute[192716]: 2025-10-07 22:14:35.920 2 DEBUG oslo_concurrency.processutils [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:14:35 compute-0 nova_compute[192716]: 2025-10-07 22:14:35.975 2 DEBUG oslo_concurrency.processutils [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:14:35 compute-0 nova_compute[192716]: 2025-10-07 22:14:35.976 2 DEBUG oslo_concurrency.processutils [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71,backing_fmt=raw /var/lib/nova/instances/14ab7182-5946-4a3f-854e-62598e86e9ee/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:14:36 compute-0 nova_compute[192716]: 2025-10-07 22:14:36.024 2 DEBUG oslo_concurrency.processutils [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71,backing_fmt=raw /var/lib/nova/instances/14ab7182-5946-4a3f-854e-62598e86e9ee/disk 1073741824" returned: 0 in 0.047s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:14:36 compute-0 nova_compute[192716]: 2025-10-07 22:14:36.025 2 DEBUG oslo_concurrency.lockutils [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.110s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:14:36 compute-0 nova_compute[192716]: 2025-10-07 22:14:36.026 2 DEBUG oslo_concurrency.processutils [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:14:36 compute-0 nova_compute[192716]: 2025-10-07 22:14:36.103 2 DEBUG oslo_concurrency.processutils [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:14:36 compute-0 nova_compute[192716]: 2025-10-07 22:14:36.105 2 DEBUG nova.virt.disk.api [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Checking if we can resize image /var/lib/nova/instances/14ab7182-5946-4a3f-854e-62598e86e9ee/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 07 22:14:36 compute-0 nova_compute[192716]: 2025-10-07 22:14:36.105 2 DEBUG oslo_concurrency.processutils [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/14ab7182-5946-4a3f-854e-62598e86e9ee/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:14:36 compute-0 nova_compute[192716]: 2025-10-07 22:14:36.192 2 DEBUG oslo_concurrency.processutils [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/14ab7182-5946-4a3f-854e-62598e86e9ee/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:14:36 compute-0 nova_compute[192716]: 2025-10-07 22:14:36.194 2 DEBUG nova.virt.disk.api [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Cannot resize image /var/lib/nova/instances/14ab7182-5946-4a3f-854e-62598e86e9ee/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 07 22:14:36 compute-0 nova_compute[192716]: 2025-10-07 22:14:36.194 2 DEBUG nova.objects.instance [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lazy-loading 'migration_context' on Instance uuid 14ab7182-5946-4a3f-854e-62598e86e9ee obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 22:14:36 compute-0 ovn_controller[94904]: 2025-10-07T22:14:36Z|00211|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Oct 07 22:14:36 compute-0 nova_compute[192716]: 2025-10-07 22:14:36.702 2 DEBUG nova.objects.base [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Object Instance<14ab7182-5946-4a3f-854e-62598e86e9ee> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Oct 07 22:14:36 compute-0 nova_compute[192716]: 2025-10-07 22:14:36.703 2 DEBUG oslo_concurrency.processutils [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/14ab7182-5946-4a3f-854e-62598e86e9ee/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:14:36 compute-0 nova_compute[192716]: 2025-10-07 22:14:36.731 2 DEBUG oslo_concurrency.processutils [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/14ab7182-5946-4a3f-854e-62598e86e9ee/disk.config 497664" returned: 0 in 0.028s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:14:36 compute-0 nova_compute[192716]: 2025-10-07 22:14:36.733 2 DEBUG nova.virt.libvirt.driver [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 14ab7182-5946-4a3f-854e-62598e86e9ee] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Oct 07 22:14:36 compute-0 nova_compute[192716]: 2025-10-07 22:14:36.735 2 DEBUG nova.virt.libvirt.vif [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-10-07T22:13:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-1016137733',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(3),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-1016137733',id=24,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=3,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-07T22:13:44Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1151,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d5a4169ef6e443b4a1f43aa9ac237c66',ramdisk_id='',reservation_id='r-rx7q1i29',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-898676780',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-898676780-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T22:13:44Z,user_data=None,user_id='ce5b7880b5ed459d9196d63a71180641',uuid=14ab7182-5946-4a3f-854e-62598e86e9ee,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a37754ca-59c3-4b33-94b2-fc7aec552c76", "address": "fa:16:3e:df:fb:43", "network": {"id": "3a911305-1c41-4e8b-b203-9200b81948ee", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-245730221-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3de874d926748bd99c4598b8d738295", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapa37754ca-59", "ovs_interfaceid": "a37754ca-59c3-4b33-94b2-fc7aec552c76", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 07 22:14:36 compute-0 nova_compute[192716]: 2025-10-07 22:14:36.736 2 DEBUG nova.network.os_vif_util [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Converting VIF {"id": "a37754ca-59c3-4b33-94b2-fc7aec552c76", "address": "fa:16:3e:df:fb:43", "network": {"id": "3a911305-1c41-4e8b-b203-9200b81948ee", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-245730221-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3de874d926748bd99c4598b8d738295", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapa37754ca-59", "ovs_interfaceid": "a37754ca-59c3-4b33-94b2-fc7aec552c76", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 22:14:36 compute-0 nova_compute[192716]: 2025-10-07 22:14:36.737 2 DEBUG nova.network.os_vif_util [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:df:fb:43,bridge_name='br-int',has_traffic_filtering=True,id=a37754ca-59c3-4b33-94b2-fc7aec552c76,network=Network(3a911305-1c41-4e8b-b203-9200b81948ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa37754ca-59') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 22:14:36 compute-0 nova_compute[192716]: 2025-10-07 22:14:36.738 2 DEBUG os_vif [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:df:fb:43,bridge_name='br-int',has_traffic_filtering=True,id=a37754ca-59c3-4b33-94b2-fc7aec552c76,network=Network(3a911305-1c41-4e8b-b203-9200b81948ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa37754ca-59') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 07 22:14:36 compute-0 nova_compute[192716]: 2025-10-07 22:14:36.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:14:36 compute-0 nova_compute[192716]: 2025-10-07 22:14:36.741 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:14:36 compute-0 nova_compute[192716]: 2025-10-07 22:14:36.742 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:14:36 compute-0 nova_compute[192716]: 2025-10-07 22:14:36.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:14:36 compute-0 nova_compute[192716]: 2025-10-07 22:14:36.743 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '834bd47c-2da2-595a-8d6a-429d2e90cfd4', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:14:36 compute-0 nova_compute[192716]: 2025-10-07 22:14:36.745 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:14:36 compute-0 nova_compute[192716]: 2025-10-07 22:14:36.746 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:14:36 compute-0 nova_compute[192716]: 2025-10-07 22:14:36.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:14:36 compute-0 nova_compute[192716]: 2025-10-07 22:14:36.750 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa37754ca-59, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:14:36 compute-0 nova_compute[192716]: 2025-10-07 22:14:36.751 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapa37754ca-59, col_values=(('qos', UUID('5e710ca2-8b40-4372-8587-004762f63cbf')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:14:36 compute-0 nova_compute[192716]: 2025-10-07 22:14:36.751 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapa37754ca-59, col_values=(('external_ids', {'iface-id': 'a37754ca-59c3-4b33-94b2-fc7aec552c76', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:df:fb:43', 'vm-uuid': '14ab7182-5946-4a3f-854e-62598e86e9ee'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:14:36 compute-0 nova_compute[192716]: 2025-10-07 22:14:36.753 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:14:36 compute-0 NetworkManager[51722]: <info>  [1759875276.7539] manager: (tapa37754ca-59): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/77)
Oct 07 22:14:36 compute-0 nova_compute[192716]: 2025-10-07 22:14:36.755 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 07 22:14:36 compute-0 nova_compute[192716]: 2025-10-07 22:14:36.758 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:14:36 compute-0 nova_compute[192716]: 2025-10-07 22:14:36.759 2 INFO os_vif [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:df:fb:43,bridge_name='br-int',has_traffic_filtering=True,id=a37754ca-59c3-4b33-94b2-fc7aec552c76,network=Network(3a911305-1c41-4e8b-b203-9200b81948ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa37754ca-59')
Oct 07 22:14:36 compute-0 nova_compute[192716]: 2025-10-07 22:14:36.759 2 DEBUG nova.virt.libvirt.driver [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Oct 07 22:14:36 compute-0 nova_compute[192716]: 2025-10-07 22:14:36.760 2 DEBUG nova.compute.manager [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpsbma5bt5',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='14ab7182-5946-4a3f-854e-62598e86e9ee',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Oct 07 22:14:36 compute-0 nova_compute[192716]: 2025-10-07 22:14:36.760 2 WARNING neutronclient.v2_0.client [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:14:36 compute-0 nova_compute[192716]: 2025-10-07 22:14:36.870 2 WARNING neutronclient.v2_0.client [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:14:37 compute-0 nova_compute[192716]: 2025-10-07 22:14:37.837 2 DEBUG nova.network.neutron [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 14ab7182-5946-4a3f-854e-62598e86e9ee] Port a37754ca-59c3-4b33-94b2-fc7aec552c76 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Oct 07 22:14:37 compute-0 nova_compute[192716]: 2025-10-07 22:14:37.856 2 DEBUG nova.compute.manager [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpsbma5bt5',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='14ab7182-5946-4a3f-854e-62598e86e9ee',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Oct 07 22:14:37 compute-0 nova_compute[192716]: 2025-10-07 22:14:37.991 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:14:39 compute-0 nova_compute[192716]: 2025-10-07 22:14:39.985 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:14:40 compute-0 nova_compute[192716]: 2025-10-07 22:14:40.437 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:14:41 compute-0 kernel: tapa37754ca-59: entered promiscuous mode
Oct 07 22:14:41 compute-0 NetworkManager[51722]: <info>  [1759875281.3513] manager: (tapa37754ca-59): new Tun device (/org/freedesktop/NetworkManager/Devices/78)
Oct 07 22:14:41 compute-0 ovn_controller[94904]: 2025-10-07T22:14:41Z|00212|binding|INFO|Claiming lport a37754ca-59c3-4b33-94b2-fc7aec552c76 for this additional chassis.
Oct 07 22:14:41 compute-0 ovn_controller[94904]: 2025-10-07T22:14:41Z|00213|binding|INFO|a37754ca-59c3-4b33-94b2-fc7aec552c76: Claiming fa:16:3e:df:fb:43 10.100.0.8
Oct 07 22:14:41 compute-0 nova_compute[192716]: 2025-10-07 22:14:41.353 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:14:41.364 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:df:fb:43 10.100.0.8'], port_security=['fa:16:3e:df:fb:43 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '14ab7182-5946-4a3f-854e-62598e86e9ee', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3a911305-1c41-4e8b-b203-9200b81948ee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd5a4169ef6e443b4a1f43aa9ac237c66', 'neutron:revision_number': '10', 'neutron:security_group_ids': '294a90bf-037a-4fa5-8621-501690236cf0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fef08549-83ea-44ba-a65f-44ca77f5b5f1, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=a37754ca-59c3-4b33-94b2-fc7aec552c76) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:14:41.365 103791 INFO neutron.agent.ovn.metadata.agent [-] Port a37754ca-59c3-4b33-94b2-fc7aec552c76 in datapath 3a911305-1c41-4e8b-b203-9200b81948ee unbound from our chassis
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:14:41.367 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3a911305-1c41-4e8b-b203-9200b81948ee
Oct 07 22:14:41 compute-0 ovn_controller[94904]: 2025-10-07T22:14:41Z|00214|binding|INFO|Setting lport a37754ca-59c3-4b33-94b2-fc7aec552c76 ovn-installed in OVS
Oct 07 22:14:41 compute-0 nova_compute[192716]: 2025-10-07 22:14:41.373 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:14:41 compute-0 nova_compute[192716]: 2025-10-07 22:14:41.375 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:14:41.380 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[2d646361-eb54-44cf-8c04-a1a6c2cd64b7]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:14:41.381 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3a911305-11 in ovnmeta-3a911305-1c41-4e8b-b203-9200b81948ee namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:14:41.383 214116 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3a911305-10 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:14:41.383 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[d388d2aa-bdf0-4ef5-9e51-90227c605a89]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:14:41.384 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[d101164c-cae8-4cf3-969b-42f2eacdb90a]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:14:41.394 103905 DEBUG oslo.privsep.daemon [-] privsep: reply[408e6182-4f39-45bd-bd98-d50d05cbb6a6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:14:41 compute-0 systemd-machined[152719]: New machine qemu-18-instance-00000018.
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:14:41.410 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[19fd5fcc-5fa9-407f-a289-4db8dbc87cd1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:14:41 compute-0 systemd[1]: Started Virtual Machine qemu-18-instance-00000018.
Oct 07 22:14:41 compute-0 systemd-udevd[225203]: Network interface NamePolicy= disabled on kernel command line.
Oct 07 22:14:41 compute-0 NetworkManager[51722]: <info>  [1759875281.4369] device (tapa37754ca-59): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 07 22:14:41 compute-0 NetworkManager[51722]: <info>  [1759875281.4385] device (tapa37754ca-59): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:14:41.453 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[722d70b5-9eb8-49e5-b15f-dfc6e60c2c5d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:14:41.458 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[748d5c01-39c3-45ae-a285-ac07c6d3dcc0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:14:41 compute-0 NetworkManager[51722]: <info>  [1759875281.4606] manager: (tap3a911305-10): new Veth device (/org/freedesktop/NetworkManager/Devices/79)
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:14:41.485 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[313ee97a-c609-408d-bdfe-148791b252ad]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:14:41.489 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[324e3831-fb62-4365-9093-95714b44b169]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:14:41 compute-0 NetworkManager[51722]: <info>  [1759875281.5205] device (tap3a911305-10): carrier: link connected
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:14:41.528 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[38be72ac-b116-4b49-857e-2c1682f1f8e4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:14:41.551 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[d3e54deb-3051-4d62-af0a-da53bad6e100]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3a911305-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:47:d8:54'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 57], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 501510, 'reachable_time': 42802, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225232, 'error': None, 'target': 'ovnmeta-3a911305-1c41-4e8b-b203-9200b81948ee', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:14:41.574 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[dee57d00-2cac-41ee-a304-922a358c8aff]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe47:d854'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 501510, 'tstamp': 501510}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225233, 'error': None, 'target': 'ovnmeta-3a911305-1c41-4e8b-b203-9200b81948ee', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:14:41.591 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[4bc3e435-bc8a-48f3-9c8f-a21b1cca4e77]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3a911305-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:47:d8:54'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 57], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 501510, 'reachable_time': 42802, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225234, 'error': None, 'target': 'ovnmeta-3a911305-1c41-4e8b-b203-9200b81948ee', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:14:41.629 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[e93e7f9e-bfd6-4086-aee6-8d0d7cfe5555]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:14:41.705 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[ed0f8b66-ab70-46be-b326-f13692f90a4c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:14:41.706 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3a911305-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:14:41.706 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:14:41.707 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3a911305-10, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:14:41 compute-0 NetworkManager[51722]: <info>  [1759875281.7106] manager: (tap3a911305-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/80)
Oct 07 22:14:41 compute-0 kernel: tap3a911305-10: entered promiscuous mode
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:14:41.714 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3a911305-10, col_values=(('external_ids', {'iface-id': 'b4222228-ac79-43ed-9842-3bc20abbb5e0'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:14:41 compute-0 ovn_controller[94904]: 2025-10-07T22:14:41Z|00215|binding|INFO|Releasing lport b4222228-ac79-43ed-9842-3bc20abbb5e0 from this chassis (sb_readonly=0)
Oct 07 22:14:41 compute-0 nova_compute[192716]: 2025-10-07 22:14:41.718 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:14:41.737 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[bf80ae34-6d45-4702-aa6e-9c96fd585819]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:14:41.738 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3a911305-1c41-4e8b-b203-9200b81948ee.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3a911305-1c41-4e8b-b203-9200b81948ee.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:14:41.738 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3a911305-1c41-4e8b-b203-9200b81948ee.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3a911305-1c41-4e8b-b203-9200b81948ee.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:14:41.738 103791 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 3a911305-1c41-4e8b-b203-9200b81948ee disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:14:41.738 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3a911305-1c41-4e8b-b203-9200b81948ee.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3a911305-1c41-4e8b-b203-9200b81948ee.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 22:14:41 compute-0 nova_compute[192716]: 2025-10-07 22:14:41.739 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:14:41.740 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[ed85c647-e8cb-4a1c-91c1-0241679fec95]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:14:41.740 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3a911305-1c41-4e8b-b203-9200b81948ee.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3a911305-1c41-4e8b-b203-9200b81948ee.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:14:41.740 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[14fd2f8f-3f81-4119-b183-b9e2cb4eb1a2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:14:41.741 103791 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]: global
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]:     log         /dev/log local0 debug
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]:     log-tag     haproxy-metadata-proxy-3a911305-1c41-4e8b-b203-9200b81948ee
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]:     user        root
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]:     group       root
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]:     maxconn     1024
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]:     pidfile     /var/lib/neutron/external/pids/3a911305-1c41-4e8b-b203-9200b81948ee.pid.haproxy
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]:     daemon
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]: 
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]: defaults
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]:     log global
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]:     mode http
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]:     option httplog
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]:     option dontlognull
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]:     option http-server-close
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]:     option forwardfor
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]:     retries                 3
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]:     timeout http-request    30s
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]:     timeout connect         30s
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]:     timeout client          32s
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]:     timeout server          32s
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]:     timeout http-keep-alive 30s
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]: 
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]: listen listener
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]:     bind 169.254.169.254:80
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]:     
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]:     server metadata /var/lib/neutron/metadata_proxy
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]: 
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]:     http-request add-header X-OVN-Network-ID 3a911305-1c41-4e8b-b203-9200b81948ee
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Oct 07 22:14:41 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:14:41.743 103791 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3a911305-1c41-4e8b-b203-9200b81948ee', 'env', 'PROCESS_TAG=haproxy-3a911305-1c41-4e8b-b203-9200b81948ee', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3a911305-1c41-4e8b-b203-9200b81948ee.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Oct 07 22:14:41 compute-0 nova_compute[192716]: 2025-10-07 22:14:41.752 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:14:42 compute-0 podman[225273]: 2025-10-07 22:14:42.198551596 +0000 UTC m=+0.061416544 container create 17bcc7890054377cd226be17c7900309cb2e4fe1b24b5ba83c07a8ed5ddb3784 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-3a911305-1c41-4e8b-b203-9200b81948ee, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251007, tcib_managed=true)
Oct 07 22:14:42 compute-0 systemd[1]: Started libpod-conmon-17bcc7890054377cd226be17c7900309cb2e4fe1b24b5ba83c07a8ed5ddb3784.scope.
Oct 07 22:14:42 compute-0 podman[225273]: 2025-10-07 22:14:42.168977702 +0000 UTC m=+0.031842680 image pull 24d4277b41bbd1d97b6f360ea068040fe96182680512bacad34d1f578f4798a9 38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Oct 07 22:14:42 compute-0 systemd[1]: Started libcrun container.
Oct 07 22:14:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e5860b4eb9d0e92732b6d7cfa62ce4b922ef90ad82e0238e36761e30d9b7f101/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 07 22:14:42 compute-0 podman[225273]: 2025-10-07 22:14:42.283292632 +0000 UTC m=+0.146157670 container init 17bcc7890054377cd226be17c7900309cb2e4fe1b24b5ba83c07a8ed5ddb3784 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-3a911305-1c41-4e8b-b203-9200b81948ee, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 07 22:14:42 compute-0 podman[225273]: 2025-10-07 22:14:42.293760684 +0000 UTC m=+0.156625662 container start 17bcc7890054377cd226be17c7900309cb2e4fe1b24b5ba83c07a8ed5ddb3784 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-3a911305-1c41-4e8b-b203-9200b81948ee, io.buildah.version=1.41.4, tcib_managed=true, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct 07 22:14:42 compute-0 neutron-haproxy-ovnmeta-3a911305-1c41-4e8b-b203-9200b81948ee[225288]: [NOTICE]   (225292) : New worker (225294) forked
Oct 07 22:14:42 compute-0 neutron-haproxy-ovnmeta-3a911305-1c41-4e8b-b203-9200b81948ee[225288]: [NOTICE]   (225292) : Loading success.
Oct 07 22:14:44 compute-0 nova_compute[192716]: 2025-10-07 22:14:44.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:14:45 compute-0 ovn_controller[94904]: 2025-10-07T22:14:45Z|00216|binding|INFO|Claiming lport a37754ca-59c3-4b33-94b2-fc7aec552c76 for this chassis.
Oct 07 22:14:45 compute-0 ovn_controller[94904]: 2025-10-07T22:14:45Z|00217|binding|INFO|a37754ca-59c3-4b33-94b2-fc7aec552c76: Claiming fa:16:3e:df:fb:43 10.100.0.8
Oct 07 22:14:45 compute-0 ovn_controller[94904]: 2025-10-07T22:14:45Z|00218|binding|INFO|Setting lport a37754ca-59c3-4b33-94b2-fc7aec552c76 up in Southbound
Oct 07 22:14:45 compute-0 nova_compute[192716]: 2025-10-07 22:14:45.440 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:14:45 compute-0 nova_compute[192716]: 2025-10-07 22:14:45.509 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:14:45 compute-0 nova_compute[192716]: 2025-10-07 22:14:45.510 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:14:45 compute-0 nova_compute[192716]: 2025-10-07 22:14:45.511 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:14:45 compute-0 nova_compute[192716]: 2025-10-07 22:14:45.511 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 07 22:14:46 compute-0 nova_compute[192716]: 2025-10-07 22:14:46.166 2 INFO nova.compute.manager [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 14ab7182-5946-4a3f-854e-62598e86e9ee] Post operation of migration started
Oct 07 22:14:46 compute-0 nova_compute[192716]: 2025-10-07 22:14:46.167 2 WARNING neutronclient.v2_0.client [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:14:46 compute-0 nova_compute[192716]: 2025-10-07 22:14:46.444 2 WARNING neutronclient.v2_0.client [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:14:46 compute-0 nova_compute[192716]: 2025-10-07 22:14:46.445 2 WARNING neutronclient.v2_0.client [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:14:46 compute-0 nova_compute[192716]: 2025-10-07 22:14:46.563 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/14ab7182-5946-4a3f-854e-62598e86e9ee/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:14:46 compute-0 nova_compute[192716]: 2025-10-07 22:14:46.649 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/14ab7182-5946-4a3f-854e-62598e86e9ee/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:14:46 compute-0 nova_compute[192716]: 2025-10-07 22:14:46.651 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/14ab7182-5946-4a3f-854e-62598e86e9ee/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:14:46 compute-0 nova_compute[192716]: 2025-10-07 22:14:46.741 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/14ab7182-5946-4a3f-854e-62598e86e9ee/disk --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:14:46 compute-0 nova_compute[192716]: 2025-10-07 22:14:46.753 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:14:46 compute-0 nova_compute[192716]: 2025-10-07 22:14:46.957 2 WARNING nova.virt.libvirt.driver [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 22:14:46 compute-0 nova_compute[192716]: 2025-10-07 22:14:46.959 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:14:46 compute-0 nova_compute[192716]: 2025-10-07 22:14:46.979 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.020s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:14:46 compute-0 nova_compute[192716]: 2025-10-07 22:14:46.980 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5581MB free_disk=73.26981353759766GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 07 22:14:46 compute-0 nova_compute[192716]: 2025-10-07 22:14:46.981 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:14:46 compute-0 nova_compute[192716]: 2025-10-07 22:14:46.981 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:14:47 compute-0 nova_compute[192716]: 2025-10-07 22:14:47.419 2 DEBUG oslo_concurrency.lockutils [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "refresh_cache-14ab7182-5946-4a3f-854e-62598e86e9ee" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 07 22:14:47 compute-0 nova_compute[192716]: 2025-10-07 22:14:47.420 2 DEBUG oslo_concurrency.lockutils [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquired lock "refresh_cache-14ab7182-5946-4a3f-854e-62598e86e9ee" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 07 22:14:47 compute-0 nova_compute[192716]: 2025-10-07 22:14:47.421 2 DEBUG nova.network.neutron [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 14ab7182-5946-4a3f-854e-62598e86e9ee] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 07 22:14:47 compute-0 nova_compute[192716]: 2025-10-07 22:14:47.928 2 WARNING neutronclient.v2_0.client [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:14:48 compute-0 nova_compute[192716]: 2025-10-07 22:14:48.000 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Migration for instance 14ab7182-5946-4a3f-854e-62598e86e9ee refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Oct 07 22:14:48 compute-0 nova_compute[192716]: 2025-10-07 22:14:48.510 2 WARNING neutronclient.v2_0.client [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:14:48 compute-0 nova_compute[192716]: 2025-10-07 22:14:48.513 2 INFO nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] [instance: 14ab7182-5946-4a3f-854e-62598e86e9ee] Updating resource usage from migration c206cb10-6532-4581-9333-f8d740d7c507
Oct 07 22:14:48 compute-0 nova_compute[192716]: 2025-10-07 22:14:48.513 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] [instance: 14ab7182-5946-4a3f-854e-62598e86e9ee] Starting to track incoming migration c206cb10-6532-4581-9333-f8d740d7c507 with flavor afb8956e-0de1-4ca9-90e4-e4702e3eee79 _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Oct 07 22:14:48 compute-0 podman[225321]: 2025-10-07 22:14:48.848650084 +0000 UTC m=+0.083555266 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=iscsid, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251007, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, container_name=iscsid)
Oct 07 22:14:48 compute-0 podman[225322]: 2025-10-07 22:14:48.865145446 +0000 UTC m=+0.087929133 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251007, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, tcib_build_tag=watcher_latest)
Oct 07 22:14:49 compute-0 nova_compute[192716]: 2025-10-07 22:14:49.580 2 DEBUG nova.network.neutron [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 14ab7182-5946-4a3f-854e-62598e86e9ee] Updating instance_info_cache with network_info: [{"id": "a37754ca-59c3-4b33-94b2-fc7aec552c76", "address": "fa:16:3e:df:fb:43", "network": {"id": "3a911305-1c41-4e8b-b203-9200b81948ee", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-245730221-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3de874d926748bd99c4598b8d738295", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa37754ca-59", "ovs_interfaceid": "a37754ca-59c3-4b33-94b2-fc7aec552c76", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:14:49 compute-0 nova_compute[192716]: 2025-10-07 22:14:49.591 2 WARNING nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Instance 14ab7182-5946-4a3f-854e-62598e86e9ee has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 1151, 'VCPU': 1}}.
Oct 07 22:14:49 compute-0 nova_compute[192716]: 2025-10-07 22:14:49.592 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 07 22:14:49 compute-0 nova_compute[192716]: 2025-10-07 22:14:49.592 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=1663MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:14:46 up  1:23,  0 user,  load average: 0.13, 0.16, 0.23\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 07 22:14:49 compute-0 nova_compute[192716]: 2025-10-07 22:14:49.706 2 DEBUG nova.compute.provider_tree [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 22:14:50 compute-0 nova_compute[192716]: 2025-10-07 22:14:50.089 2 DEBUG oslo_concurrency.lockutils [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Releasing lock "refresh_cache-14ab7182-5946-4a3f-854e-62598e86e9ee" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 07 22:14:50 compute-0 nova_compute[192716]: 2025-10-07 22:14:50.214 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 22:14:50 compute-0 nova_compute[192716]: 2025-10-07 22:14:50.443 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:14:50 compute-0 nova_compute[192716]: 2025-10-07 22:14:50.612 2 DEBUG oslo_concurrency.lockutils [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:14:50 compute-0 nova_compute[192716]: 2025-10-07 22:14:50.724 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 07 22:14:50 compute-0 nova_compute[192716]: 2025-10-07 22:14:50.724 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.743s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:14:50 compute-0 nova_compute[192716]: 2025-10-07 22:14:50.724 2 DEBUG oslo_concurrency.lockutils [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.112s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:14:50 compute-0 nova_compute[192716]: 2025-10-07 22:14:50.725 2 DEBUG oslo_concurrency.lockutils [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:14:50 compute-0 nova_compute[192716]: 2025-10-07 22:14:50.729 2 INFO nova.virt.libvirt.driver [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 14ab7182-5946-4a3f-854e-62598e86e9ee] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Oct 07 22:14:50 compute-0 virtqemud[192532]: Domain id=18 name='instance-00000018' uuid=14ab7182-5946-4a3f-854e-62598e86e9ee is tainted: custom-monitor
Oct 07 22:14:51 compute-0 nova_compute[192716]: 2025-10-07 22:14:51.726 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:14:51 compute-0 nova_compute[192716]: 2025-10-07 22:14:51.726 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:14:51 compute-0 nova_compute[192716]: 2025-10-07 22:14:51.727 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:14:51 compute-0 nova_compute[192716]: 2025-10-07 22:14:51.737 2 INFO nova.virt.libvirt.driver [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 14ab7182-5946-4a3f-854e-62598e86e9ee] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Oct 07 22:14:51 compute-0 nova_compute[192716]: 2025-10-07 22:14:51.755 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:14:51 compute-0 podman[225361]: 2025-10-07 22:14:51.832711223 +0000 UTC m=+0.068093133 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 07 22:14:52 compute-0 nova_compute[192716]: 2025-10-07 22:14:52.744 2 INFO nova.virt.libvirt.driver [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 14ab7182-5946-4a3f-854e-62598e86e9ee] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Oct 07 22:14:52 compute-0 nova_compute[192716]: 2025-10-07 22:14:52.749 2 DEBUG nova.compute.manager [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 14ab7182-5946-4a3f-854e-62598e86e9ee] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 07 22:14:53 compute-0 nova_compute[192716]: 2025-10-07 22:14:53.259 2 DEBUG nova.objects.instance [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 14ab7182-5946-4a3f-854e-62598e86e9ee] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Oct 07 22:14:53 compute-0 nova_compute[192716]: 2025-10-07 22:14:53.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:14:54 compute-0 nova_compute[192716]: 2025-10-07 22:14:54.287 2 WARNING neutronclient.v2_0.client [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:14:54 compute-0 nova_compute[192716]: 2025-10-07 22:14:54.423 2 WARNING neutronclient.v2_0.client [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:14:54 compute-0 nova_compute[192716]: 2025-10-07 22:14:54.423 2 WARNING neutronclient.v2_0.client [None req-ff4ebdc2-5bc2-4bc2-8e19-5ad224c142a4 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:14:55 compute-0 nova_compute[192716]: 2025-10-07 22:14:55.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:14:56 compute-0 nova_compute[192716]: 2025-10-07 22:14:56.757 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:14:59 compute-0 podman[203153]: time="2025-10-07T22:14:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:14:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:14:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20751 "" "Go-http-client/1.1"
Oct 07 22:14:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:14:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3489 "" "Go-http-client/1.1"
Oct 07 22:14:59 compute-0 podman[225386]: 2025-10-07 22:14:59.839193948 +0000 UTC m=+0.068678250 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4)
Oct 07 22:14:59 compute-0 podman[225385]: 2025-10-07 22:14:59.892076185 +0000 UTC m=+0.132532868 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 07 22:15:00 compute-0 nova_compute[192716]: 2025-10-07 22:15:00.474 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:15:01 compute-0 nova_compute[192716]: 2025-10-07 22:15:01.134 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:15:01 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:15:01.135 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:e1:cc', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '62:84:ba:a4:1b:31'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:15:01 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:15:01.136 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 07 22:15:01 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:15:01.137 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=dca786dc-b408-4181-8e47-0e14c60f13da, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:15:01 compute-0 openstack_network_exporter[205305]: ERROR   22:15:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:15:01 compute-0 openstack_network_exporter[205305]: ERROR   22:15:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:15:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:15:01 compute-0 openstack_network_exporter[205305]: ERROR   22:15:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:15:01 compute-0 openstack_network_exporter[205305]: ERROR   22:15:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:15:01 compute-0 openstack_network_exporter[205305]: ERROR   22:15:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:15:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:15:01 compute-0 nova_compute[192716]: 2025-10-07 22:15:01.759 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:15:05 compute-0 nova_compute[192716]: 2025-10-07 22:15:05.478 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:15:05 compute-0 podman[225428]: 2025-10-07 22:15:05.850567217 +0000 UTC m=+0.083176824 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, version=9.6, io.buildah.version=1.33.7, vcs-type=git, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9-minimal, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers)
Oct 07 22:15:06 compute-0 nova_compute[192716]: 2025-10-07 22:15:06.760 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:15:07 compute-0 nova_compute[192716]: 2025-10-07 22:15:07.488 2 DEBUG oslo_concurrency.lockutils [None req-0dae9ae8-69df-4ffc-bfc8-18129a21bac0 ce5b7880b5ed459d9196d63a71180641 d5a4169ef6e443b4a1f43aa9ac237c66 - - default default] Acquiring lock "14ab7182-5946-4a3f-854e-62598e86e9ee" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:15:07 compute-0 nova_compute[192716]: 2025-10-07 22:15:07.489 2 DEBUG oslo_concurrency.lockutils [None req-0dae9ae8-69df-4ffc-bfc8-18129a21bac0 ce5b7880b5ed459d9196d63a71180641 d5a4169ef6e443b4a1f43aa9ac237c66 - - default default] Lock "14ab7182-5946-4a3f-854e-62598e86e9ee" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:15:07 compute-0 nova_compute[192716]: 2025-10-07 22:15:07.489 2 DEBUG oslo_concurrency.lockutils [None req-0dae9ae8-69df-4ffc-bfc8-18129a21bac0 ce5b7880b5ed459d9196d63a71180641 d5a4169ef6e443b4a1f43aa9ac237c66 - - default default] Acquiring lock "14ab7182-5946-4a3f-854e-62598e86e9ee-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:15:07 compute-0 nova_compute[192716]: 2025-10-07 22:15:07.490 2 DEBUG oslo_concurrency.lockutils [None req-0dae9ae8-69df-4ffc-bfc8-18129a21bac0 ce5b7880b5ed459d9196d63a71180641 d5a4169ef6e443b4a1f43aa9ac237c66 - - default default] Lock "14ab7182-5946-4a3f-854e-62598e86e9ee-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:15:07 compute-0 nova_compute[192716]: 2025-10-07 22:15:07.490 2 DEBUG oslo_concurrency.lockutils [None req-0dae9ae8-69df-4ffc-bfc8-18129a21bac0 ce5b7880b5ed459d9196d63a71180641 d5a4169ef6e443b4a1f43aa9ac237c66 - - default default] Lock "14ab7182-5946-4a3f-854e-62598e86e9ee-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:15:07 compute-0 nova_compute[192716]: 2025-10-07 22:15:07.503 2 INFO nova.compute.manager [None req-0dae9ae8-69df-4ffc-bfc8-18129a21bac0 ce5b7880b5ed459d9196d63a71180641 d5a4169ef6e443b4a1f43aa9ac237c66 - - default default] [instance: 14ab7182-5946-4a3f-854e-62598e86e9ee] Terminating instance
Oct 07 22:15:08 compute-0 nova_compute[192716]: 2025-10-07 22:15:08.021 2 DEBUG nova.compute.manager [None req-0dae9ae8-69df-4ffc-bfc8-18129a21bac0 ce5b7880b5ed459d9196d63a71180641 d5a4169ef6e443b4a1f43aa9ac237c66 - - default default] [instance: 14ab7182-5946-4a3f-854e-62598e86e9ee] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 07 22:15:08 compute-0 kernel: tapa37754ca-59 (unregistering): left promiscuous mode
Oct 07 22:15:08 compute-0 NetworkManager[51722]: <info>  [1759875308.0456] device (tapa37754ca-59): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 07 22:15:08 compute-0 ovn_controller[94904]: 2025-10-07T22:15:08Z|00219|binding|INFO|Releasing lport a37754ca-59c3-4b33-94b2-fc7aec552c76 from this chassis (sb_readonly=0)
Oct 07 22:15:08 compute-0 ovn_controller[94904]: 2025-10-07T22:15:08Z|00220|binding|INFO|Setting lport a37754ca-59c3-4b33-94b2-fc7aec552c76 down in Southbound
Oct 07 22:15:08 compute-0 nova_compute[192716]: 2025-10-07 22:15:08.057 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:15:08 compute-0 ovn_controller[94904]: 2025-10-07T22:15:08Z|00221|binding|INFO|Removing iface tapa37754ca-59 ovn-installed in OVS
Oct 07 22:15:08 compute-0 nova_compute[192716]: 2025-10-07 22:15:08.060 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:15:08 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:15:08.068 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:df:fb:43 10.100.0.8'], port_security=['fa:16:3e:df:fb:43 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '14ab7182-5946-4a3f-854e-62598e86e9ee', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3a911305-1c41-4e8b-b203-9200b81948ee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd5a4169ef6e443b4a1f43aa9ac237c66', 'neutron:revision_number': '15', 'neutron:security_group_ids': '294a90bf-037a-4fa5-8621-501690236cf0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fef08549-83ea-44ba-a65f-44ca77f5b5f1, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], logical_port=a37754ca-59c3-4b33-94b2-fc7aec552c76) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f071072e510>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:15:08 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:15:08.070 103791 INFO neutron.agent.ovn.metadata.agent [-] Port a37754ca-59c3-4b33-94b2-fc7aec552c76 in datapath 3a911305-1c41-4e8b-b203-9200b81948ee unbound from our chassis
Oct 07 22:15:08 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:15:08.071 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3a911305-1c41-4e8b-b203-9200b81948ee, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 07 22:15:08 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:15:08.072 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[de639836-bf60-4eca-8d80-a73dd31ba67f]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:15:08 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:15:08.073 103791 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3a911305-1c41-4e8b-b203-9200b81948ee namespace which is not needed anymore
Oct 07 22:15:08 compute-0 nova_compute[192716]: 2025-10-07 22:15:08.092 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:15:08 compute-0 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000018.scope: Deactivated successfully.
Oct 07 22:15:08 compute-0 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000018.scope: Consumed 2.815s CPU time.
Oct 07 22:15:08 compute-0 systemd-machined[152719]: Machine qemu-18-instance-00000018 terminated.
Oct 07 22:15:08 compute-0 podman[225475]: 2025-10-07 22:15:08.274262645 +0000 UTC m=+0.054797274 container kill 17bcc7890054377cd226be17c7900309cb2e4fe1b24b5ba83c07a8ed5ddb3784 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-3a911305-1c41-4e8b-b203-9200b81948ee, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251007, tcib_managed=true)
Oct 07 22:15:08 compute-0 neutron-haproxy-ovnmeta-3a911305-1c41-4e8b-b203-9200b81948ee[225288]: [NOTICE]   (225292) : haproxy version is 3.0.5-8e879a5
Oct 07 22:15:08 compute-0 neutron-haproxy-ovnmeta-3a911305-1c41-4e8b-b203-9200b81948ee[225288]: [NOTICE]   (225292) : path to executable is /usr/sbin/haproxy
Oct 07 22:15:08 compute-0 neutron-haproxy-ovnmeta-3a911305-1c41-4e8b-b203-9200b81948ee[225288]: [WARNING]  (225292) : Exiting Master process...
Oct 07 22:15:08 compute-0 neutron-haproxy-ovnmeta-3a911305-1c41-4e8b-b203-9200b81948ee[225288]: [ALERT]    (225292) : Current worker (225294) exited with code 143 (Terminated)
Oct 07 22:15:08 compute-0 neutron-haproxy-ovnmeta-3a911305-1c41-4e8b-b203-9200b81948ee[225288]: [WARNING]  (225292) : All workers exited. Exiting... (0)
Oct 07 22:15:08 compute-0 systemd[1]: libpod-17bcc7890054377cd226be17c7900309cb2e4fe1b24b5ba83c07a8ed5ddb3784.scope: Deactivated successfully.
Oct 07 22:15:08 compute-0 nova_compute[192716]: 2025-10-07 22:15:08.306 2 INFO nova.virt.libvirt.driver [-] [instance: 14ab7182-5946-4a3f-854e-62598e86e9ee] Instance destroyed successfully.
Oct 07 22:15:08 compute-0 nova_compute[192716]: 2025-10-07 22:15:08.307 2 DEBUG nova.objects.instance [None req-0dae9ae8-69df-4ffc-bfc8-18129a21bac0 ce5b7880b5ed459d9196d63a71180641 d5a4169ef6e443b4a1f43aa9ac237c66 - - default default] Lazy-loading 'resources' on Instance uuid 14ab7182-5946-4a3f-854e-62598e86e9ee obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 22:15:08 compute-0 podman[225502]: 2025-10-07 22:15:08.348792646 +0000 UTC m=+0.044197794 container died 17bcc7890054377cd226be17c7900309cb2e4fe1b24b5ba83c07a8ed5ddb3784 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-3a911305-1c41-4e8b-b203-9200b81948ee, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team)
Oct 07 22:15:08 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-17bcc7890054377cd226be17c7900309cb2e4fe1b24b5ba83c07a8ed5ddb3784-userdata-shm.mount: Deactivated successfully.
Oct 07 22:15:08 compute-0 systemd[1]: var-lib-containers-storage-overlay-e5860b4eb9d0e92732b6d7cfa62ce4b922ef90ad82e0238e36761e30d9b7f101-merged.mount: Deactivated successfully.
Oct 07 22:15:08 compute-0 podman[225502]: 2025-10-07 22:15:08.394387479 +0000 UTC m=+0.089792587 container cleanup 17bcc7890054377cd226be17c7900309cb2e4fe1b24b5ba83c07a8ed5ddb3784 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-3a911305-1c41-4e8b-b203-9200b81948ee, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Oct 07 22:15:08 compute-0 systemd[1]: libpod-conmon-17bcc7890054377cd226be17c7900309cb2e4fe1b24b5ba83c07a8ed5ddb3784.scope: Deactivated successfully.
Oct 07 22:15:08 compute-0 podman[225506]: 2025-10-07 22:15:08.415354072 +0000 UTC m=+0.093061043 container remove 17bcc7890054377cd226be17c7900309cb2e4fe1b24b5ba83c07a8ed5ddb3784 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-3a911305-1c41-4e8b-b203-9200b81948ee, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4)
Oct 07 22:15:08 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:15:08.432 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[c67a9975-f7d0-41ce-9361-a8dfc7017900]: (4, ("Tue Oct  7 10:15:08 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-3a911305-1c41-4e8b-b203-9200b81948ee (17bcc7890054377cd226be17c7900309cb2e4fe1b24b5ba83c07a8ed5ddb3784)\n17bcc7890054377cd226be17c7900309cb2e4fe1b24b5ba83c07a8ed5ddb3784\nTue Oct  7 10:15:08 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-3a911305-1c41-4e8b-b203-9200b81948ee (17bcc7890054377cd226be17c7900309cb2e4fe1b24b5ba83c07a8ed5ddb3784)\n17bcc7890054377cd226be17c7900309cb2e4fe1b24b5ba83c07a8ed5ddb3784\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:15:08 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:15:08.434 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[44367372-d557-45c5-873d-826bba218dcb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:15:08 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:15:08.435 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3a911305-1c41-4e8b-b203-9200b81948ee.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3a911305-1c41-4e8b-b203-9200b81948ee.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 22:15:08 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:15:08.436 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[30f37a27-b864-43bb-9ff9-f9bf335e671d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:15:08 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:15:08.437 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3a911305-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:15:08 compute-0 nova_compute[192716]: 2025-10-07 22:15:08.440 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:15:08 compute-0 kernel: tap3a911305-10: left promiscuous mode
Oct 07 22:15:08 compute-0 nova_compute[192716]: 2025-10-07 22:15:08.469 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:15:08 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:15:08.471 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[7b768cc5-34f7-40b2-b890-2a78e9400f9d]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:15:08 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:15:08.509 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[d7070335-d749-456b-83bf-48c68d687998]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:15:08 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:15:08.510 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[71cdceff-f730-4c64-9706-e8e6bde36166]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:15:08 compute-0 nova_compute[192716]: 2025-10-07 22:15:08.522 2 DEBUG nova.compute.manager [req-9e92ac1f-1ae7-4d31-8bf5-d132864343ae req-c5fa84cb-37e0-4354-88b1-088392025c82 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 14ab7182-5946-4a3f-854e-62598e86e9ee] Received event network-vif-unplugged-a37754ca-59c3-4b33-94b2-fc7aec552c76 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:15:08 compute-0 nova_compute[192716]: 2025-10-07 22:15:08.523 2 DEBUG oslo_concurrency.lockutils [req-9e92ac1f-1ae7-4d31-8bf5-d132864343ae req-c5fa84cb-37e0-4354-88b1-088392025c82 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "14ab7182-5946-4a3f-854e-62598e86e9ee-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:15:08 compute-0 nova_compute[192716]: 2025-10-07 22:15:08.523 2 DEBUG oslo_concurrency.lockutils [req-9e92ac1f-1ae7-4d31-8bf5-d132864343ae req-c5fa84cb-37e0-4354-88b1-088392025c82 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "14ab7182-5946-4a3f-854e-62598e86e9ee-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:15:08 compute-0 nova_compute[192716]: 2025-10-07 22:15:08.524 2 DEBUG oslo_concurrency.lockutils [req-9e92ac1f-1ae7-4d31-8bf5-d132864343ae req-c5fa84cb-37e0-4354-88b1-088392025c82 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "14ab7182-5946-4a3f-854e-62598e86e9ee-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:15:08 compute-0 nova_compute[192716]: 2025-10-07 22:15:08.524 2 DEBUG nova.compute.manager [req-9e92ac1f-1ae7-4d31-8bf5-d132864343ae req-c5fa84cb-37e0-4354-88b1-088392025c82 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 14ab7182-5946-4a3f-854e-62598e86e9ee] No waiting events found dispatching network-vif-unplugged-a37754ca-59c3-4b33-94b2-fc7aec552c76 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 22:15:08 compute-0 nova_compute[192716]: 2025-10-07 22:15:08.525 2 DEBUG nova.compute.manager [req-9e92ac1f-1ae7-4d31-8bf5-d132864343ae req-c5fa84cb-37e0-4354-88b1-088392025c82 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 14ab7182-5946-4a3f-854e-62598e86e9ee] Received event network-vif-unplugged-a37754ca-59c3-4b33-94b2-fc7aec552c76 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 07 22:15:08 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:15:08.534 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[84f3a686-944b-4b26-ac1b-1f81eb6737d2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 501502, 'reachable_time': 27524, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225539, 'error': None, 'target': 'ovnmeta-3a911305-1c41-4e8b-b203-9200b81948ee', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:15:08 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:15:08.537 103905 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3a911305-1c41-4e8b-b203-9200b81948ee deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Oct 07 22:15:08 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:15:08.537 103905 DEBUG oslo.privsep.daemon [-] privsep: reply[137816f6-95e4-454c-90ac-15bcba6f02c2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:15:08 compute-0 systemd[1]: run-netns-ovnmeta\x2d3a911305\x2d1c41\x2d4e8b\x2db203\x2d9200b81948ee.mount: Deactivated successfully.
Oct 07 22:15:08 compute-0 nova_compute[192716]: 2025-10-07 22:15:08.815 2 DEBUG nova.virt.libvirt.vif [None req-0dae9ae8-69df-4ffc-bfc8-18129a21bac0 ce5b7880b5ed459d9196d63a71180641 d5a4169ef6e443b4a1f43aa9ac237c66 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-10-07T22:13:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-1016137733',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(3),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-1016137733',id=24,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=3,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-07T22:13:44Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1151,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d5a4169ef6e443b4a1f43aa9ac237c66',ramdisk_id='',reservation_id='r-rx7q1i29',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',clean_attempts='1',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mo
del='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-898676780',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-898676780-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T22:14:53Z,user_data=None,user_id='ce5b7880b5ed459d9196d63a71180641',uuid=14ab7182-5946-4a3f-854e-62598e86e9ee,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a37754ca-59c3-4b33-94b2-fc7aec552c76", "address": "fa:16:3e:df:fb:43", "network": {"id": "3a911305-1c41-4e8b-b203-9200b81948ee", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-245730221-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3de874d926748bd99c4598b8d738295", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa37754ca-59", "ovs_interfaceid": "a37754ca-59c3-4b33-94b2-fc7aec552c76", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 07 22:15:08 compute-0 nova_compute[192716]: 2025-10-07 22:15:08.816 2 DEBUG nova.network.os_vif_util [None req-0dae9ae8-69df-4ffc-bfc8-18129a21bac0 ce5b7880b5ed459d9196d63a71180641 d5a4169ef6e443b4a1f43aa9ac237c66 - - default default] Converting VIF {"id": "a37754ca-59c3-4b33-94b2-fc7aec552c76", "address": "fa:16:3e:df:fb:43", "network": {"id": "3a911305-1c41-4e8b-b203-9200b81948ee", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-245730221-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3de874d926748bd99c4598b8d738295", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa37754ca-59", "ovs_interfaceid": "a37754ca-59c3-4b33-94b2-fc7aec552c76", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 22:15:08 compute-0 nova_compute[192716]: 2025-10-07 22:15:08.817 2 DEBUG nova.network.os_vif_util [None req-0dae9ae8-69df-4ffc-bfc8-18129a21bac0 ce5b7880b5ed459d9196d63a71180641 d5a4169ef6e443b4a1f43aa9ac237c66 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:df:fb:43,bridge_name='br-int',has_traffic_filtering=True,id=a37754ca-59c3-4b33-94b2-fc7aec552c76,network=Network(3a911305-1c41-4e8b-b203-9200b81948ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa37754ca-59') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 22:15:08 compute-0 nova_compute[192716]: 2025-10-07 22:15:08.818 2 DEBUG os_vif [None req-0dae9ae8-69df-4ffc-bfc8-18129a21bac0 ce5b7880b5ed459d9196d63a71180641 d5a4169ef6e443b4a1f43aa9ac237c66 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:df:fb:43,bridge_name='br-int',has_traffic_filtering=True,id=a37754ca-59c3-4b33-94b2-fc7aec552c76,network=Network(3a911305-1c41-4e8b-b203-9200b81948ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa37754ca-59') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 07 22:15:08 compute-0 nova_compute[192716]: 2025-10-07 22:15:08.820 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:15:08 compute-0 nova_compute[192716]: 2025-10-07 22:15:08.821 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa37754ca-59, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:15:08 compute-0 nova_compute[192716]: 2025-10-07 22:15:08.822 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:15:08 compute-0 nova_compute[192716]: 2025-10-07 22:15:08.825 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:15:08 compute-0 nova_compute[192716]: 2025-10-07 22:15:08.825 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:15:08 compute-0 nova_compute[192716]: 2025-10-07 22:15:08.826 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=5e710ca2-8b40-4372-8587-004762f63cbf) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:15:08 compute-0 nova_compute[192716]: 2025-10-07 22:15:08.826 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:15:08 compute-0 nova_compute[192716]: 2025-10-07 22:15:08.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:15:08 compute-0 nova_compute[192716]: 2025-10-07 22:15:08.829 2 INFO os_vif [None req-0dae9ae8-69df-4ffc-bfc8-18129a21bac0 ce5b7880b5ed459d9196d63a71180641 d5a4169ef6e443b4a1f43aa9ac237c66 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:df:fb:43,bridge_name='br-int',has_traffic_filtering=True,id=a37754ca-59c3-4b33-94b2-fc7aec552c76,network=Network(3a911305-1c41-4e8b-b203-9200b81948ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa37754ca-59')
Oct 07 22:15:08 compute-0 nova_compute[192716]: 2025-10-07 22:15:08.830 2 INFO nova.virt.libvirt.driver [None req-0dae9ae8-69df-4ffc-bfc8-18129a21bac0 ce5b7880b5ed459d9196d63a71180641 d5a4169ef6e443b4a1f43aa9ac237c66 - - default default] [instance: 14ab7182-5946-4a3f-854e-62598e86e9ee] Deleting instance files /var/lib/nova/instances/14ab7182-5946-4a3f-854e-62598e86e9ee_del
Oct 07 22:15:08 compute-0 nova_compute[192716]: 2025-10-07 22:15:08.831 2 INFO nova.virt.libvirt.driver [None req-0dae9ae8-69df-4ffc-bfc8-18129a21bac0 ce5b7880b5ed459d9196d63a71180641 d5a4169ef6e443b4a1f43aa9ac237c66 - - default default] [instance: 14ab7182-5946-4a3f-854e-62598e86e9ee] Deletion of /var/lib/nova/instances/14ab7182-5946-4a3f-854e-62598e86e9ee_del complete
Oct 07 22:15:09 compute-0 nova_compute[192716]: 2025-10-07 22:15:09.346 2 INFO nova.compute.manager [None req-0dae9ae8-69df-4ffc-bfc8-18129a21bac0 ce5b7880b5ed459d9196d63a71180641 d5a4169ef6e443b4a1f43aa9ac237c66 - - default default] [instance: 14ab7182-5946-4a3f-854e-62598e86e9ee] Took 1.32 seconds to destroy the instance on the hypervisor.
Oct 07 22:15:09 compute-0 nova_compute[192716]: 2025-10-07 22:15:09.347 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-0dae9ae8-69df-4ffc-bfc8-18129a21bac0 ce5b7880b5ed459d9196d63a71180641 d5a4169ef6e443b4a1f43aa9ac237c66 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 07 22:15:09 compute-0 nova_compute[192716]: 2025-10-07 22:15:09.347 2 DEBUG nova.compute.manager [-] [instance: 14ab7182-5946-4a3f-854e-62598e86e9ee] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 07 22:15:09 compute-0 nova_compute[192716]: 2025-10-07 22:15:09.348 2 DEBUG nova.network.neutron [-] [instance: 14ab7182-5946-4a3f-854e-62598e86e9ee] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 07 22:15:09 compute-0 nova_compute[192716]: 2025-10-07 22:15:09.348 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:15:10 compute-0 nova_compute[192716]: 2025-10-07 22:15:10.456 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:15:10 compute-0 nova_compute[192716]: 2025-10-07 22:15:10.480 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:15:10 compute-0 nova_compute[192716]: 2025-10-07 22:15:10.585 2 DEBUG nova.compute.manager [req-243fb678-1500-43a0-bd23-1ad829e90a46 req-a7f643da-ed5b-40e3-b5ee-e023b0ebc24d 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 14ab7182-5946-4a3f-854e-62598e86e9ee] Received event network-vif-unplugged-a37754ca-59c3-4b33-94b2-fc7aec552c76 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:15:10 compute-0 nova_compute[192716]: 2025-10-07 22:15:10.585 2 DEBUG oslo_concurrency.lockutils [req-243fb678-1500-43a0-bd23-1ad829e90a46 req-a7f643da-ed5b-40e3-b5ee-e023b0ebc24d 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "14ab7182-5946-4a3f-854e-62598e86e9ee-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:15:10 compute-0 nova_compute[192716]: 2025-10-07 22:15:10.586 2 DEBUG oslo_concurrency.lockutils [req-243fb678-1500-43a0-bd23-1ad829e90a46 req-a7f643da-ed5b-40e3-b5ee-e023b0ebc24d 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "14ab7182-5946-4a3f-854e-62598e86e9ee-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:15:10 compute-0 nova_compute[192716]: 2025-10-07 22:15:10.586 2 DEBUG oslo_concurrency.lockutils [req-243fb678-1500-43a0-bd23-1ad829e90a46 req-a7f643da-ed5b-40e3-b5ee-e023b0ebc24d 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "14ab7182-5946-4a3f-854e-62598e86e9ee-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:15:10 compute-0 nova_compute[192716]: 2025-10-07 22:15:10.587 2 DEBUG nova.compute.manager [req-243fb678-1500-43a0-bd23-1ad829e90a46 req-a7f643da-ed5b-40e3-b5ee-e023b0ebc24d 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 14ab7182-5946-4a3f-854e-62598e86e9ee] No waiting events found dispatching network-vif-unplugged-a37754ca-59c3-4b33-94b2-fc7aec552c76 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 22:15:10 compute-0 nova_compute[192716]: 2025-10-07 22:15:10.587 2 DEBUG nova.compute.manager [req-243fb678-1500-43a0-bd23-1ad829e90a46 req-a7f643da-ed5b-40e3-b5ee-e023b0ebc24d 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 14ab7182-5946-4a3f-854e-62598e86e9ee] Received event network-vif-unplugged-a37754ca-59c3-4b33-94b2-fc7aec552c76 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 07 22:15:12 compute-0 nova_compute[192716]: 2025-10-07 22:15:12.526 2 DEBUG nova.compute.manager [req-2b98a7db-d223-4d9b-af43-b9cd46b81d65 req-91409dbb-e1e6-477c-b162-074fd5e99f17 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 14ab7182-5946-4a3f-854e-62598e86e9ee] Received event network-vif-deleted-a37754ca-59c3-4b33-94b2-fc7aec552c76 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:15:12 compute-0 nova_compute[192716]: 2025-10-07 22:15:12.526 2 INFO nova.compute.manager [req-2b98a7db-d223-4d9b-af43-b9cd46b81d65 req-91409dbb-e1e6-477c-b162-074fd5e99f17 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 14ab7182-5946-4a3f-854e-62598e86e9ee] Neutron deleted interface a37754ca-59c3-4b33-94b2-fc7aec552c76; detaching it from the instance and deleting it from the info cache
Oct 07 22:15:12 compute-0 nova_compute[192716]: 2025-10-07 22:15:12.526 2 DEBUG nova.network.neutron [req-2b98a7db-d223-4d9b-af43-b9cd46b81d65 req-91409dbb-e1e6-477c-b162-074fd5e99f17 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 14ab7182-5946-4a3f-854e-62598e86e9ee] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:15:12 compute-0 nova_compute[192716]: 2025-10-07 22:15:12.973 2 DEBUG nova.network.neutron [-] [instance: 14ab7182-5946-4a3f-854e-62598e86e9ee] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:15:13 compute-0 nova_compute[192716]: 2025-10-07 22:15:13.035 2 DEBUG nova.compute.manager [req-2b98a7db-d223-4d9b-af43-b9cd46b81d65 req-91409dbb-e1e6-477c-b162-074fd5e99f17 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 14ab7182-5946-4a3f-854e-62598e86e9ee] Detach interface failed, port_id=a37754ca-59c3-4b33-94b2-fc7aec552c76, reason: Instance 14ab7182-5946-4a3f-854e-62598e86e9ee could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 07 22:15:13 compute-0 nova_compute[192716]: 2025-10-07 22:15:13.482 2 INFO nova.compute.manager [-] [instance: 14ab7182-5946-4a3f-854e-62598e86e9ee] Took 4.13 seconds to deallocate network for instance.
Oct 07 22:15:13 compute-0 nova_compute[192716]: 2025-10-07 22:15:13.828 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:15:14 compute-0 nova_compute[192716]: 2025-10-07 22:15:14.001 2 DEBUG oslo_concurrency.lockutils [None req-0dae9ae8-69df-4ffc-bfc8-18129a21bac0 ce5b7880b5ed459d9196d63a71180641 d5a4169ef6e443b4a1f43aa9ac237c66 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:15:14 compute-0 nova_compute[192716]: 2025-10-07 22:15:14.001 2 DEBUG oslo_concurrency.lockutils [None req-0dae9ae8-69df-4ffc-bfc8-18129a21bac0 ce5b7880b5ed459d9196d63a71180641 d5a4169ef6e443b4a1f43aa9ac237c66 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:15:14 compute-0 nova_compute[192716]: 2025-10-07 22:15:14.006 2 DEBUG oslo_concurrency.lockutils [None req-0dae9ae8-69df-4ffc-bfc8-18129a21bac0 ce5b7880b5ed459d9196d63a71180641 d5a4169ef6e443b4a1f43aa9ac237c66 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.005s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:15:14 compute-0 nova_compute[192716]: 2025-10-07 22:15:14.044 2 INFO nova.scheduler.client.report [None req-0dae9ae8-69df-4ffc-bfc8-18129a21bac0 ce5b7880b5ed459d9196d63a71180641 d5a4169ef6e443b4a1f43aa9ac237c66 - - default default] Deleted allocations for instance 14ab7182-5946-4a3f-854e-62598e86e9ee
Oct 07 22:15:15 compute-0 nova_compute[192716]: 2025-10-07 22:15:15.074 2 DEBUG oslo_concurrency.lockutils [None req-0dae9ae8-69df-4ffc-bfc8-18129a21bac0 ce5b7880b5ed459d9196d63a71180641 d5a4169ef6e443b4a1f43aa9ac237c66 - - default default] Lock "14ab7182-5946-4a3f-854e-62598e86e9ee" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.585s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:15:15 compute-0 nova_compute[192716]: 2025-10-07 22:15:15.482 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:15:18 compute-0 nova_compute[192716]: 2025-10-07 22:15:18.830 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:15:19 compute-0 podman[225541]: 2025-10-07 22:15:19.836646937 +0000 UTC m=+0.071779319 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, container_name=multipathd, io.buildah.version=1.41.4, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct 07 22:15:19 compute-0 podman[225540]: 2025-10-07 22:15:19.837189823 +0000 UTC m=+0.068765251 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, config_id=iscsid, org.label-schema.schema-version=1.0)
Oct 07 22:15:20 compute-0 nova_compute[192716]: 2025-10-07 22:15:20.032 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:15:20 compute-0 nova_compute[192716]: 2025-10-07 22:15:20.486 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:15:22 compute-0 podman[225579]: 2025-10-07 22:15:22.821300645 +0000 UTC m=+0.066147036 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 07 22:15:23 compute-0 nova_compute[192716]: 2025-10-07 22:15:23.832 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:15:25 compute-0 nova_compute[192716]: 2025-10-07 22:15:25.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:15:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:15:25.653 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:15:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:15:25.653 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:15:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:15:25.654 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:15:28 compute-0 nova_compute[192716]: 2025-10-07 22:15:28.834 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:15:29 compute-0 podman[203153]: time="2025-10-07T22:15:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:15:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:15:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 22:15:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:15:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3030 "" "Go-http-client/1.1"
Oct 07 22:15:30 compute-0 nova_compute[192716]: 2025-10-07 22:15:30.492 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:15:30 compute-0 podman[225606]: 2025-10-07 22:15:30.828106339 +0000 UTC m=+0.050646163 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 07 22:15:30 compute-0 podman[225605]: 2025-10-07 22:15:30.8609768 +0000 UTC m=+0.096266697 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 07 22:15:31 compute-0 openstack_network_exporter[205305]: ERROR   22:15:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:15:31 compute-0 openstack_network_exporter[205305]: ERROR   22:15:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:15:31 compute-0 openstack_network_exporter[205305]: ERROR   22:15:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:15:31 compute-0 openstack_network_exporter[205305]: ERROR   22:15:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:15:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:15:31 compute-0 openstack_network_exporter[205305]: ERROR   22:15:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:15:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:15:32 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:15:32.580 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2b:b4:6d 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-afe7be80-c16b-4cef-89c4-8851641c6faf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f18b68e1837842c293e8b6a621641354', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=79a90f3c-820c-43b7-a388-8b7a51286af4, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=b656ca07-6e70-4919-b525-077e26d9c217) old=Port_Binding(mac=['fa:16:3e:2b:b4:6d'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-afe7be80-c16b-4cef-89c4-8851641c6faf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f18b68e1837842c293e8b6a621641354', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:15:32 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:15:32.581 103791 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port b656ca07-6e70-4919-b525-077e26d9c217 in datapath afe7be80-c16b-4cef-89c4-8851641c6faf updated
Oct 07 22:15:32 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:15:32.582 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network afe7be80-c16b-4cef-89c4-8851641c6faf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 07 22:15:32 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:15:32.584 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[f0f76620-3b35-4e51-b87e-52fba2188c52]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:15:33 compute-0 nova_compute[192716]: 2025-10-07 22:15:33.836 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:15:35 compute-0 nova_compute[192716]: 2025-10-07 22:15:35.495 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:15:36 compute-0 podman[225650]: 2025-10-07 22:15:36.850979042 +0000 UTC m=+0.092805266 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, release=1755695350, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, maintainer=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.6, architecture=x86_64, managed_by=edpm_ansible, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Oct 07 22:15:36 compute-0 nova_compute[192716]: 2025-10-07 22:15:36.991 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:15:36 compute-0 nova_compute[192716]: 2025-10-07 22:15:36.991 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 07 22:15:38 compute-0 nova_compute[192716]: 2025-10-07 22:15:38.838 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:15:39 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:15:39.857 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:93:a2:0e 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-87a4c1f8-cce8-4de3-b5b1-62ee299a172c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-87a4c1f8-cce8-4de3-b5b1-62ee299a172c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4cb01004a26f472187e01e5d3a57f84a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=acbb10bf-81a7-4544-939c-c4532483b934, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=9b0ecff1-51f5-4549-8749-422f37bc378a) old=Port_Binding(mac=['fa:16:3e:93:a2:0e'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-87a4c1f8-cce8-4de3-b5b1-62ee299a172c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-87a4c1f8-cce8-4de3-b5b1-62ee299a172c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4cb01004a26f472187e01e5d3a57f84a', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:15:39 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:15:39.858 103791 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 9b0ecff1-51f5-4549-8749-422f37bc378a in datapath 87a4c1f8-cce8-4de3-b5b1-62ee299a172c updated
Oct 07 22:15:39 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:15:39.860 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 87a4c1f8-cce8-4de3-b5b1-62ee299a172c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 07 22:15:39 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:15:39.861 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[698e3867-b1c2-44dd-ad5e-b0dc3fe48351]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:15:39 compute-0 nova_compute[192716]: 2025-10-07 22:15:39.991 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:15:40 compute-0 nova_compute[192716]: 2025-10-07 22:15:40.542 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:15:41 compute-0 nova_compute[192716]: 2025-10-07 22:15:41.987 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:15:43 compute-0 nova_compute[192716]: 2025-10-07 22:15:43.840 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:15:44 compute-0 nova_compute[192716]: 2025-10-07 22:15:44.991 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:15:45 compute-0 nova_compute[192716]: 2025-10-07 22:15:45.511 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:15:45 compute-0 nova_compute[192716]: 2025-10-07 22:15:45.512 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:15:45 compute-0 nova_compute[192716]: 2025-10-07 22:15:45.512 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:15:45 compute-0 nova_compute[192716]: 2025-10-07 22:15:45.513 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 07 22:15:45 compute-0 nova_compute[192716]: 2025-10-07 22:15:45.545 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:15:45 compute-0 nova_compute[192716]: 2025-10-07 22:15:45.733 2 WARNING nova.virt.libvirt.driver [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 22:15:45 compute-0 nova_compute[192716]: 2025-10-07 22:15:45.734 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:15:45 compute-0 nova_compute[192716]: 2025-10-07 22:15:45.775 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:15:45 compute-0 nova_compute[192716]: 2025-10-07 22:15:45.775 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5848MB free_disk=73.29899978637695GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 07 22:15:45 compute-0 nova_compute[192716]: 2025-10-07 22:15:45.776 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:15:45 compute-0 nova_compute[192716]: 2025-10-07 22:15:45.776 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:15:46 compute-0 nova_compute[192716]: 2025-10-07 22:15:46.825 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 07 22:15:46 compute-0 nova_compute[192716]: 2025-10-07 22:15:46.825 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:15:45 up  1:24,  0 user,  load average: 0.08, 0.15, 0.22\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 07 22:15:46 compute-0 nova_compute[192716]: 2025-10-07 22:15:46.855 2 DEBUG nova.compute.provider_tree [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 22:15:47 compute-0 nova_compute[192716]: 2025-10-07 22:15:47.362 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 22:15:47 compute-0 nova_compute[192716]: 2025-10-07 22:15:47.875 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 07 22:15:47 compute-0 nova_compute[192716]: 2025-10-07 22:15:47.876 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.100s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:15:48 compute-0 nova_compute[192716]: 2025-10-07 22:15:48.842 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:15:48 compute-0 nova_compute[192716]: 2025-10-07 22:15:48.875 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:15:50 compute-0 nova_compute[192716]: 2025-10-07 22:15:50.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:15:50 compute-0 podman[225674]: 2025-10-07 22:15:50.85711682 +0000 UTC m=+0.082436063 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd)
Oct 07 22:15:50 compute-0 podman[225673]: 2025-10-07 22:15:50.874708145 +0000 UTC m=+0.110561566 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2)
Oct 07 22:15:50 compute-0 nova_compute[192716]: 2025-10-07 22:15:50.986 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:15:51 compute-0 nova_compute[192716]: 2025-10-07 22:15:51.496 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:15:51 compute-0 nova_compute[192716]: 2025-10-07 22:15:51.498 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:15:53 compute-0 podman[225713]: 2025-10-07 22:15:53.82582466 +0000 UTC m=+0.067370072 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 07 22:15:53 compute-0 nova_compute[192716]: 2025-10-07 22:15:53.845 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:15:53 compute-0 ovn_controller[94904]: 2025-10-07T22:15:53Z|00222|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Oct 07 22:15:53 compute-0 nova_compute[192716]: 2025-10-07 22:15:53.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:15:55 compute-0 nova_compute[192716]: 2025-10-07 22:15:55.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:15:57 compute-0 sshd-session[225737]: Invalid user github from 103.115.24.11 port 51526
Oct 07 22:15:57 compute-0 sshd-session[225737]: pam_unix(sshd:auth): check pass; user unknown
Oct 07 22:15:57 compute-0 sshd-session[225737]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.115.24.11
Oct 07 22:15:58 compute-0 nova_compute[192716]: 2025-10-07 22:15:58.847 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:15:59 compute-0 podman[203153]: time="2025-10-07T22:15:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:15:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:15:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 22:15:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:15:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3026 "" "Go-http-client/1.1"
Oct 07 22:15:59 compute-0 sshd-session[225737]: Failed password for invalid user github from 103.115.24.11 port 51526 ssh2
Oct 07 22:16:00 compute-0 sshd-session[225737]: Received disconnect from 103.115.24.11 port 51526:11: Bye Bye [preauth]
Oct 07 22:16:00 compute-0 sshd-session[225737]: Disconnected from invalid user github 103.115.24.11 port 51526 [preauth]
Oct 07 22:16:00 compute-0 nova_compute[192716]: 2025-10-07 22:16:00.606 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:16:01 compute-0 openstack_network_exporter[205305]: ERROR   22:16:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:16:01 compute-0 openstack_network_exporter[205305]: ERROR   22:16:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:16:01 compute-0 openstack_network_exporter[205305]: ERROR   22:16:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:16:01 compute-0 openstack_network_exporter[205305]: ERROR   22:16:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:16:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:16:01 compute-0 openstack_network_exporter[205305]: ERROR   22:16:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:16:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:16:01 compute-0 podman[225740]: 2025-10-07 22:16:01.856896174 +0000 UTC m=+0.092008723 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent)
Oct 07 22:16:01 compute-0 podman[225739]: 2025-10-07 22:16:01.872094079 +0000 UTC m=+0.109268638 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 07 22:16:03 compute-0 nova_compute[192716]: 2025-10-07 22:16:03.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:16:05 compute-0 nova_compute[192716]: 2025-10-07 22:16:05.608 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:16:07 compute-0 podman[225785]: 2025-10-07 22:16:07.820708659 +0000 UTC m=+0.063200099 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, vcs-type=git, config_id=edpm, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, version=9.6, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct 07 22:16:08 compute-0 nova_compute[192716]: 2025-10-07 22:16:08.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:16:09 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:16:09.673 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:e1:cc', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '62:84:ba:a4:1b:31'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:16:09 compute-0 nova_compute[192716]: 2025-10-07 22:16:09.673 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:16:09 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:16:09.674 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 07 22:16:10 compute-0 nova_compute[192716]: 2025-10-07 22:16:10.611 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:16:13 compute-0 nova_compute[192716]: 2025-10-07 22:16:13.855 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:16:15 compute-0 nova_compute[192716]: 2025-10-07 22:16:15.656 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:16:15 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:16:15.677 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=dca786dc-b408-4181-8e47-0e14c60f13da, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:16:18 compute-0 nova_compute[192716]: 2025-10-07 22:16:18.858 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:16:20 compute-0 nova_compute[192716]: 2025-10-07 22:16:20.659 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:16:21 compute-0 podman[225811]: 2025-10-07 22:16:21.844815792 +0000 UTC m=+0.081197576 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct 07 22:16:21 compute-0 podman[225810]: 2025-10-07 22:16:21.868721402 +0000 UTC m=+0.106927810 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team)
Oct 07 22:16:23 compute-0 nova_compute[192716]: 2025-10-07 22:16:23.860 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:16:24 compute-0 podman[225850]: 2025-10-07 22:16:24.835834575 +0000 UTC m=+0.072808431 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 07 22:16:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:16:25.655 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:16:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:16:25.655 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:16:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:16:25.655 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:16:25 compute-0 nova_compute[192716]: 2025-10-07 22:16:25.663 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:16:28 compute-0 nova_compute[192716]: 2025-10-07 22:16:28.863 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:16:29 compute-0 podman[203153]: time="2025-10-07T22:16:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:16:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:16:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 22:16:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:16:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3026 "" "Go-http-client/1.1"
Oct 07 22:16:30 compute-0 nova_compute[192716]: 2025-10-07 22:16:30.666 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:16:31 compute-0 openstack_network_exporter[205305]: ERROR   22:16:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:16:31 compute-0 openstack_network_exporter[205305]: ERROR   22:16:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:16:31 compute-0 openstack_network_exporter[205305]: ERROR   22:16:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:16:31 compute-0 openstack_network_exporter[205305]: ERROR   22:16:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:16:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:16:31 compute-0 openstack_network_exporter[205305]: ERROR   22:16:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:16:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:16:32 compute-0 podman[225875]: 2025-10-07 22:16:32.87001282 +0000 UTC m=+0.109228587 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_controller, io.buildah.version=1.41.4)
Oct 07 22:16:32 compute-0 podman[225876]: 2025-10-07 22:16:32.87686274 +0000 UTC m=+0.101432598 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Oct 07 22:16:33 compute-0 nova_compute[192716]: 2025-10-07 22:16:33.864 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:16:33 compute-0 nova_compute[192716]: 2025-10-07 22:16:33.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:16:35 compute-0 nova_compute[192716]: 2025-10-07 22:16:35.668 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:16:38 compute-0 podman[225920]: 2025-10-07 22:16:38.835432391 +0000 UTC m=+0.074329636 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., io.buildah.version=1.33.7, name=ubi9-minimal, release=1755695350, version=9.6, distribution-scope=public, managed_by=edpm_ansible, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=)
Oct 07 22:16:38 compute-0 nova_compute[192716]: 2025-10-07 22:16:38.866 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:16:39 compute-0 nova_compute[192716]: 2025-10-07 22:16:39.496 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:16:39 compute-0 nova_compute[192716]: 2025-10-07 22:16:39.496 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 07 22:16:40 compute-0 nova_compute[192716]: 2025-10-07 22:16:40.670 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:16:40 compute-0 nova_compute[192716]: 2025-10-07 22:16:40.991 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:16:42 compute-0 nova_compute[192716]: 2025-10-07 22:16:42.986 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:16:43 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Oct 07 22:16:43 compute-0 nova_compute[192716]: 2025-10-07 22:16:43.867 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:16:45 compute-0 nova_compute[192716]: 2025-10-07 22:16:45.673 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:16:45 compute-0 nova_compute[192716]: 2025-10-07 22:16:45.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:16:45 compute-0 nova_compute[192716]: 2025-10-07 22:16:45.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:16:46 compute-0 nova_compute[192716]: 2025-10-07 22:16:46.505 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:16:46 compute-0 nova_compute[192716]: 2025-10-07 22:16:46.506 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:16:46 compute-0 nova_compute[192716]: 2025-10-07 22:16:46.506 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:16:46 compute-0 nova_compute[192716]: 2025-10-07 22:16:46.506 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 07 22:16:46 compute-0 nova_compute[192716]: 2025-10-07 22:16:46.726 2 WARNING nova.virt.libvirt.driver [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 22:16:46 compute-0 nova_compute[192716]: 2025-10-07 22:16:46.727 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:16:46 compute-0 nova_compute[192716]: 2025-10-07 22:16:46.768 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:16:46 compute-0 nova_compute[192716]: 2025-10-07 22:16:46.769 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5852MB free_disk=73.29897689819336GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 07 22:16:46 compute-0 nova_compute[192716]: 2025-10-07 22:16:46.769 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:16:46 compute-0 nova_compute[192716]: 2025-10-07 22:16:46.769 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:16:47 compute-0 nova_compute[192716]: 2025-10-07 22:16:47.826 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 07 22:16:47 compute-0 nova_compute[192716]: 2025-10-07 22:16:47.827 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:16:46 up  1:25,  0 user,  load average: 0.03, 0.12, 0.20\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 07 22:16:47 compute-0 nova_compute[192716]: 2025-10-07 22:16:47.843 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Refreshing inventories for resource provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Oct 07 22:16:47 compute-0 nova_compute[192716]: 2025-10-07 22:16:47.866 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Updating ProviderTree inventory for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Oct 07 22:16:47 compute-0 nova_compute[192716]: 2025-10-07 22:16:47.867 2 DEBUG nova.compute.provider_tree [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Updating inventory in ProviderTree for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Oct 07 22:16:47 compute-0 nova_compute[192716]: 2025-10-07 22:16:47.886 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Refreshing aggregate associations for resource provider 19d1aa8e-e3fb-43ab-9849-122569e48a32, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Oct 07 22:16:47 compute-0 nova_compute[192716]: 2025-10-07 22:16:47.908 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Refreshing trait associations for resource provider 19d1aa8e-e3fb-43ab-9849-122569e48a32, traits: COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_RESCUE_BFV,COMPUTE_ACCELERATORS,HW_CPU_X86_CLMUL,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NODE,HW_CPU_X86_AVX2,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSSE3,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_ARCH_X86_64,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_DEVICE_TAGGING,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_EXTEND,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_AVX,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE41,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SVM,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_BMI2,COMPUTE_SOUND_MODEL_SB16,COMPUTE_SOUND_MODEL_USB,COMPUTE_SOUND_MODEL_AC97,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_AESNI,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,HW_CPU_X86_F16C,HW_CPU_X86_MMX,COMPUTE_SECURITY_TPM_TIS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_CRB,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_ABM,HW_ARCH_X86_64,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_STORAGE_BUS_SCSI _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Oct 07 22:16:47 compute-0 nova_compute[192716]: 2025-10-07 22:16:47.929 2 DEBUG nova.compute.provider_tree [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 22:16:48 compute-0 nova_compute[192716]: 2025-10-07 22:16:48.437 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 22:16:48 compute-0 nova_compute[192716]: 2025-10-07 22:16:48.869 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:16:48 compute-0 nova_compute[192716]: 2025-10-07 22:16:48.948 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 07 22:16:48 compute-0 nova_compute[192716]: 2025-10-07 22:16:48.949 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.179s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:16:49 compute-0 nova_compute[192716]: 2025-10-07 22:16:49.991 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:16:49 compute-0 nova_compute[192716]: 2025-10-07 22:16:49.991 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Oct 07 22:16:50 compute-0 nova_compute[192716]: 2025-10-07 22:16:50.499 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Oct 07 22:16:50 compute-0 nova_compute[192716]: 2025-10-07 22:16:50.676 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:16:51 compute-0 nova_compute[192716]: 2025-10-07 22:16:51.499 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:16:51 compute-0 nova_compute[192716]: 2025-10-07 22:16:51.991 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:16:52 compute-0 podman[225945]: 2025-10-07 22:16:52.806085719 +0000 UTC m=+0.052571078 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true)
Oct 07 22:16:52 compute-0 podman[225946]: 2025-10-07 22:16:52.820043698 +0000 UTC m=+0.058713679 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct 07 22:16:53 compute-0 nova_compute[192716]: 2025-10-07 22:16:53.871 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:16:55 compute-0 nova_compute[192716]: 2025-10-07 22:16:55.679 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:16:55 compute-0 podman[225986]: 2025-10-07 22:16:55.827612175 +0000 UTC m=+0.065696782 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 07 22:16:55 compute-0 nova_compute[192716]: 2025-10-07 22:16:55.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:16:57 compute-0 sshd-session[225984]: Connection reset by 205.210.31.226 port 60338 [preauth]
Oct 07 22:16:58 compute-0 nova_compute[192716]: 2025-10-07 22:16:58.873 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:16:59 compute-0 podman[203153]: time="2025-10-07T22:16:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:16:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:16:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 22:16:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:16:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3030 "" "Go-http-client/1.1"
Oct 07 22:17:00 compute-0 nova_compute[192716]: 2025-10-07 22:17:00.680 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:17:01 compute-0 openstack_network_exporter[205305]: ERROR   22:17:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:17:01 compute-0 openstack_network_exporter[205305]: ERROR   22:17:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:17:01 compute-0 openstack_network_exporter[205305]: ERROR   22:17:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:17:01 compute-0 openstack_network_exporter[205305]: ERROR   22:17:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:17:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:17:01 compute-0 openstack_network_exporter[205305]: ERROR   22:17:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:17:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:17:03 compute-0 podman[226010]: 2025-10-07 22:17:03.831492793 +0000 UTC m=+0.066248739 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0)
Oct 07 22:17:03 compute-0 podman[226009]: 2025-10-07 22:17:03.870963328 +0000 UTC m=+0.115038067 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 07 22:17:03 compute-0 nova_compute[192716]: 2025-10-07 22:17:03.875 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:17:05 compute-0 nova_compute[192716]: 2025-10-07 22:17:05.705 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:17:05 compute-0 nova_compute[192716]: 2025-10-07 22:17:05.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:17:05 compute-0 nova_compute[192716]: 2025-10-07 22:17:05.990 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Oct 07 22:17:07 compute-0 nova_compute[192716]: 2025-10-07 22:17:07.008 2 DEBUG nova.virt.libvirt.driver [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: e7c02487-5b40-4170-ba2a-9c025aae40b9] Creating tmpfile /var/lib/nova/instances/tmph7lay42o to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Oct 07 22:17:07 compute-0 nova_compute[192716]: 2025-10-07 22:17:07.009 2 WARNING neutronclient.v2_0.client [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:17:07 compute-0 nova_compute[192716]: 2025-10-07 22:17:07.054 2 DEBUG nova.virt.libvirt.driver [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 5b094b64-8b68-4676-b259-85b0d42a9ec1] Creating tmpfile /var/lib/nova/instances/tmp8n0ftfb_ to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Oct 07 22:17:07 compute-0 nova_compute[192716]: 2025-10-07 22:17:07.055 2 WARNING neutronclient.v2_0.client [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:17:07 compute-0 nova_compute[192716]: 2025-10-07 22:17:07.094 2 DEBUG nova.compute.manager [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmph7lay42o',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Oct 07 22:17:07 compute-0 nova_compute[192716]: 2025-10-07 22:17:07.136 2 DEBUG nova.compute.manager [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp8n0ftfb_',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Oct 07 22:17:08 compute-0 nova_compute[192716]: 2025-10-07 22:17:08.879 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:17:09 compute-0 nova_compute[192716]: 2025-10-07 22:17:09.144 2 WARNING neutronclient.v2_0.client [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:17:09 compute-0 nova_compute[192716]: 2025-10-07 22:17:09.159 2 WARNING neutronclient.v2_0.client [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:17:09 compute-0 podman[226053]: 2025-10-07 22:17:09.857671513 +0000 UTC m=+0.089413497 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, release=1755695350, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, build-date=2025-08-20T13:12:41, architecture=x86_64, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, distribution-scope=public, managed_by=edpm_ansible, config_id=edpm)
Oct 07 22:17:10 compute-0 nova_compute[192716]: 2025-10-07 22:17:10.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:17:13 compute-0 nova_compute[192716]: 2025-10-07 22:17:13.518 2 DEBUG nova.compute.manager [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmph7lay42o',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='e7c02487-5b40-4170-ba2a-9c025aae40b9',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Oct 07 22:17:13 compute-0 nova_compute[192716]: 2025-10-07 22:17:13.883 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:17:14 compute-0 nova_compute[192716]: 2025-10-07 22:17:14.534 2 DEBUG oslo_concurrency.lockutils [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "refresh_cache-e7c02487-5b40-4170-ba2a-9c025aae40b9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 07 22:17:14 compute-0 nova_compute[192716]: 2025-10-07 22:17:14.534 2 DEBUG oslo_concurrency.lockutils [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquired lock "refresh_cache-e7c02487-5b40-4170-ba2a-9c025aae40b9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 07 22:17:14 compute-0 nova_compute[192716]: 2025-10-07 22:17:14.535 2 DEBUG nova.network.neutron [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: e7c02487-5b40-4170-ba2a-9c025aae40b9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 07 22:17:15 compute-0 nova_compute[192716]: 2025-10-07 22:17:15.045 2 WARNING neutronclient.v2_0.client [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:17:15 compute-0 nova_compute[192716]: 2025-10-07 22:17:15.745 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:17:15 compute-0 nova_compute[192716]: 2025-10-07 22:17:15.797 2 WARNING neutronclient.v2_0.client [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:17:15 compute-0 nova_compute[192716]: 2025-10-07 22:17:15.964 2 DEBUG nova.network.neutron [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: e7c02487-5b40-4170-ba2a-9c025aae40b9] Updating instance_info_cache with network_info: [{"id": "cf1b9092-14e3-4326-99c8-2801b53abb26", "address": "fa:16:3e:6c:2c:cf", "network": {"id": "afe7be80-c16b-4cef-89c4-8851641c6faf", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-895931661-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f18b68e1837842c293e8b6a621641354", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcf1b9092-14", "ovs_interfaceid": "cf1b9092-14e3-4326-99c8-2801b53abb26", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:17:16 compute-0 nova_compute[192716]: 2025-10-07 22:17:16.470 2 DEBUG oslo_concurrency.lockutils [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Releasing lock "refresh_cache-e7c02487-5b40-4170-ba2a-9c025aae40b9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 07 22:17:16 compute-0 nova_compute[192716]: 2025-10-07 22:17:16.487 2 DEBUG nova.virt.libvirt.driver [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: e7c02487-5b40-4170-ba2a-9c025aae40b9] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmph7lay42o',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='e7c02487-5b40-4170-ba2a-9c025aae40b9',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Oct 07 22:17:16 compute-0 nova_compute[192716]: 2025-10-07 22:17:16.488 2 DEBUG nova.virt.libvirt.driver [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: e7c02487-5b40-4170-ba2a-9c025aae40b9] Creating instance directory: /var/lib/nova/instances/e7c02487-5b40-4170-ba2a-9c025aae40b9 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Oct 07 22:17:16 compute-0 nova_compute[192716]: 2025-10-07 22:17:16.489 2 DEBUG nova.virt.libvirt.driver [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: e7c02487-5b40-4170-ba2a-9c025aae40b9] Creating disk.info with the contents: {'/var/lib/nova/instances/e7c02487-5b40-4170-ba2a-9c025aae40b9/disk': 'qcow2', '/var/lib/nova/instances/e7c02487-5b40-4170-ba2a-9c025aae40b9/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Oct 07 22:17:16 compute-0 nova_compute[192716]: 2025-10-07 22:17:16.490 2 DEBUG nova.virt.libvirt.driver [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: e7c02487-5b40-4170-ba2a-9c025aae40b9] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Oct 07 22:17:16 compute-0 nova_compute[192716]: 2025-10-07 22:17:16.491 2 DEBUG nova.objects.instance [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lazy-loading 'trusted_certs' on Instance uuid e7c02487-5b40-4170-ba2a-9c025aae40b9 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 22:17:17 compute-0 nova_compute[192716]: 2025-10-07 22:17:17.000 2 DEBUG oslo_utils.imageutils.format_inspector [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:17:17 compute-0 nova_compute[192716]: 2025-10-07 22:17:17.003 2 DEBUG oslo_utils.imageutils.format_inspector [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:17:17 compute-0 nova_compute[192716]: 2025-10-07 22:17:17.004 2 DEBUG oslo_concurrency.processutils [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:17:17 compute-0 nova_compute[192716]: 2025-10-07 22:17:17.070 2 DEBUG oslo_concurrency.processutils [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:17:17 compute-0 nova_compute[192716]: 2025-10-07 22:17:17.071 2 DEBUG oslo_concurrency.lockutils [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:17:17 compute-0 nova_compute[192716]: 2025-10-07 22:17:17.071 2 DEBUG oslo_concurrency.lockutils [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:17:17 compute-0 nova_compute[192716]: 2025-10-07 22:17:17.072 2 DEBUG oslo_utils.imageutils.format_inspector [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:17:17 compute-0 nova_compute[192716]: 2025-10-07 22:17:17.075 2 DEBUG oslo_utils.imageutils.format_inspector [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:17:17 compute-0 nova_compute[192716]: 2025-10-07 22:17:17.075 2 DEBUG oslo_concurrency.processutils [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:17:17 compute-0 nova_compute[192716]: 2025-10-07 22:17:17.126 2 DEBUG oslo_concurrency.processutils [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:17:17 compute-0 nova_compute[192716]: 2025-10-07 22:17:17.127 2 DEBUG oslo_concurrency.processutils [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71,backing_fmt=raw /var/lib/nova/instances/e7c02487-5b40-4170-ba2a-9c025aae40b9/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:17:17 compute-0 nova_compute[192716]: 2025-10-07 22:17:17.165 2 DEBUG oslo_concurrency.processutils [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71,backing_fmt=raw /var/lib/nova/instances/e7c02487-5b40-4170-ba2a-9c025aae40b9/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:17:17 compute-0 nova_compute[192716]: 2025-10-07 22:17:17.166 2 DEBUG oslo_concurrency.lockutils [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.094s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:17:17 compute-0 nova_compute[192716]: 2025-10-07 22:17:17.166 2 DEBUG oslo_concurrency.processutils [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:17:17 compute-0 nova_compute[192716]: 2025-10-07 22:17:17.232 2 DEBUG oslo_concurrency.processutils [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:17:17 compute-0 nova_compute[192716]: 2025-10-07 22:17:17.233 2 DEBUG nova.virt.disk.api [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Checking if we can resize image /var/lib/nova/instances/e7c02487-5b40-4170-ba2a-9c025aae40b9/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 07 22:17:17 compute-0 nova_compute[192716]: 2025-10-07 22:17:17.234 2 DEBUG oslo_concurrency.processutils [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e7c02487-5b40-4170-ba2a-9c025aae40b9/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:17:17 compute-0 nova_compute[192716]: 2025-10-07 22:17:17.297 2 DEBUG oslo_concurrency.processutils [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e7c02487-5b40-4170-ba2a-9c025aae40b9/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:17:17 compute-0 nova_compute[192716]: 2025-10-07 22:17:17.299 2 DEBUG nova.virt.disk.api [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Cannot resize image /var/lib/nova/instances/e7c02487-5b40-4170-ba2a-9c025aae40b9/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 07 22:17:17 compute-0 nova_compute[192716]: 2025-10-07 22:17:17.299 2 DEBUG nova.objects.instance [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lazy-loading 'migration_context' on Instance uuid e7c02487-5b40-4170-ba2a-9c025aae40b9 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 22:17:17 compute-0 nova_compute[192716]: 2025-10-07 22:17:17.807 2 DEBUG nova.objects.base [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Object Instance<e7c02487-5b40-4170-ba2a-9c025aae40b9> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Oct 07 22:17:17 compute-0 nova_compute[192716]: 2025-10-07 22:17:17.808 2 DEBUG oslo_concurrency.processutils [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/e7c02487-5b40-4170-ba2a-9c025aae40b9/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:17:17 compute-0 nova_compute[192716]: 2025-10-07 22:17:17.839 2 DEBUG oslo_concurrency.processutils [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/e7c02487-5b40-4170-ba2a-9c025aae40b9/disk.config 497664" returned: 0 in 0.031s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:17:17 compute-0 nova_compute[192716]: 2025-10-07 22:17:17.840 2 DEBUG nova.virt.libvirt.driver [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: e7c02487-5b40-4170-ba2a-9c025aae40b9] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Oct 07 22:17:17 compute-0 nova_compute[192716]: 2025-10-07 22:17:17.841 2 DEBUG nova.virt.libvirt.vif [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-10-07T22:15:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-1445845716',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-1445845',id=26,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-07T22:16:11Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='4cb01004a26f472187e01e5d3a57f84a',ramdisk_id='',reservation_id='r-u6ansgmm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-866189760',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-866189760-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T22:16:11Z,user_data=None,user_id='a0c373c3cf7242d4af22e259b5a27a6b',uuid=e7c02487-5b40-4170-ba2a-9c025aae40b9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cf1b9092-14e3-4326-99c8-2801b53abb26", "address": "fa:16:3e:6c:2c:cf", "network": {"id": "afe7be80-c16b-4cef-89c4-8851641c6faf", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-895931661-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f18b68e1837842c293e8b6a621641354", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapcf1b9092-14", "ovs_interfaceid": "cf1b9092-14e3-4326-99c8-2801b53abb26", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 07 22:17:17 compute-0 nova_compute[192716]: 2025-10-07 22:17:17.841 2 DEBUG nova.network.os_vif_util [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Converting VIF {"id": "cf1b9092-14e3-4326-99c8-2801b53abb26", "address": "fa:16:3e:6c:2c:cf", "network": {"id": "afe7be80-c16b-4cef-89c4-8851641c6faf", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-895931661-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f18b68e1837842c293e8b6a621641354", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapcf1b9092-14", "ovs_interfaceid": "cf1b9092-14e3-4326-99c8-2801b53abb26", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 22:17:17 compute-0 nova_compute[192716]: 2025-10-07 22:17:17.842 2 DEBUG nova.network.os_vif_util [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6c:2c:cf,bridge_name='br-int',has_traffic_filtering=True,id=cf1b9092-14e3-4326-99c8-2801b53abb26,network=Network(afe7be80-c16b-4cef-89c4-8851641c6faf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcf1b9092-14') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 22:17:17 compute-0 nova_compute[192716]: 2025-10-07 22:17:17.842 2 DEBUG os_vif [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6c:2c:cf,bridge_name='br-int',has_traffic_filtering=True,id=cf1b9092-14e3-4326-99c8-2801b53abb26,network=Network(afe7be80-c16b-4cef-89c4-8851641c6faf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcf1b9092-14') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 07 22:17:17 compute-0 nova_compute[192716]: 2025-10-07 22:17:17.843 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:17:17 compute-0 nova_compute[192716]: 2025-10-07 22:17:17.843 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:17:17 compute-0 nova_compute[192716]: 2025-10-07 22:17:17.844 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:17:17 compute-0 nova_compute[192716]: 2025-10-07 22:17:17.844 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:17:17 compute-0 nova_compute[192716]: 2025-10-07 22:17:17.845 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '5defd495-7ce6-5c49-b0b9-e69143c981e5', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:17:17 compute-0 nova_compute[192716]: 2025-10-07 22:17:17.847 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:17:17 compute-0 nova_compute[192716]: 2025-10-07 22:17:17.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:17:17 compute-0 nova_compute[192716]: 2025-10-07 22:17:17.850 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcf1b9092-14, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:17:17 compute-0 nova_compute[192716]: 2025-10-07 22:17:17.850 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapcf1b9092-14, col_values=(('qos', UUID('51fa3eb2-22be-46c0-b3da-fe2523e59f24')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:17:17 compute-0 nova_compute[192716]: 2025-10-07 22:17:17.850 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapcf1b9092-14, col_values=(('external_ids', {'iface-id': 'cf1b9092-14e3-4326-99c8-2801b53abb26', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6c:2c:cf', 'vm-uuid': 'e7c02487-5b40-4170-ba2a-9c025aae40b9'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:17:17 compute-0 NetworkManager[51722]: <info>  [1759875437.8524] manager: (tapcf1b9092-14): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/81)
Oct 07 22:17:17 compute-0 nova_compute[192716]: 2025-10-07 22:17:17.853 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 07 22:17:17 compute-0 nova_compute[192716]: 2025-10-07 22:17:17.858 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:17:17 compute-0 nova_compute[192716]: 2025-10-07 22:17:17.859 2 INFO os_vif [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6c:2c:cf,bridge_name='br-int',has_traffic_filtering=True,id=cf1b9092-14e3-4326-99c8-2801b53abb26,network=Network(afe7be80-c16b-4cef-89c4-8851641c6faf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcf1b9092-14')
Oct 07 22:17:17 compute-0 nova_compute[192716]: 2025-10-07 22:17:17.860 2 DEBUG nova.virt.libvirt.driver [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Oct 07 22:17:17 compute-0 nova_compute[192716]: 2025-10-07 22:17:17.860 2 DEBUG nova.compute.manager [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmph7lay42o',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='e7c02487-5b40-4170-ba2a-9c025aae40b9',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Oct 07 22:17:17 compute-0 nova_compute[192716]: 2025-10-07 22:17:17.861 2 WARNING neutronclient.v2_0.client [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:17:17 compute-0 nova_compute[192716]: 2025-10-07 22:17:17.949 2 WARNING neutronclient.v2_0.client [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:17:18 compute-0 nova_compute[192716]: 2025-10-07 22:17:18.987 2 DEBUG nova.network.neutron [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: e7c02487-5b40-4170-ba2a-9c025aae40b9] Port cf1b9092-14e3-4326-99c8-2801b53abb26 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Oct 07 22:17:19 compute-0 nova_compute[192716]: 2025-10-07 22:17:19.001 2 DEBUG nova.compute.manager [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmph7lay42o',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='e7c02487-5b40-4170-ba2a-9c025aae40b9',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Oct 07 22:17:20 compute-0 nova_compute[192716]: 2025-10-07 22:17:20.747 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:17:22 compute-0 systemd[1]: Starting libvirt proxy daemon...
Oct 07 22:17:22 compute-0 systemd[1]: Started libvirt proxy daemon.
Oct 07 22:17:22 compute-0 kernel: tapcf1b9092-14: entered promiscuous mode
Oct 07 22:17:22 compute-0 NetworkManager[51722]: <info>  [1759875442.5345] manager: (tapcf1b9092-14): new Tun device (/org/freedesktop/NetworkManager/Devices/82)
Oct 07 22:17:22 compute-0 ovn_controller[94904]: 2025-10-07T22:17:22Z|00223|binding|INFO|Claiming lport cf1b9092-14e3-4326-99c8-2801b53abb26 for this additional chassis.
Oct 07 22:17:22 compute-0 ovn_controller[94904]: 2025-10-07T22:17:22Z|00224|binding|INFO|cf1b9092-14e3-4326-99c8-2801b53abb26: Claiming fa:16:3e:6c:2c:cf 10.100.0.14
Oct 07 22:17:22 compute-0 nova_compute[192716]: 2025-10-07 22:17:22.536 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:17:22 compute-0 nova_compute[192716]: 2025-10-07 22:17:22.540 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:17:22 compute-0 nova_compute[192716]: 2025-10-07 22:17:22.547 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:17:22 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:17:22.555 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6c:2c:cf 10.100.0.14'], port_security=['fa:16:3e:6c:2c:cf 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'e7c02487-5b40-4170-ba2a-9c025aae40b9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-afe7be80-c16b-4cef-89c4-8851641c6faf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4cb01004a26f472187e01e5d3a57f84a', 'neutron:revision_number': '10', 'neutron:security_group_ids': '93dab7df-ccdf-44ad-a320-72fe683eb516', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=79a90f3c-820c-43b7-a388-8b7a51286af4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=cf1b9092-14e3-4326-99c8-2801b53abb26) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:17:22 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:17:22.556 103791 INFO neutron.agent.ovn.metadata.agent [-] Port cf1b9092-14e3-4326-99c8-2801b53abb26 in datapath afe7be80-c16b-4cef-89c4-8851641c6faf unbound from our chassis
Oct 07 22:17:22 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:17:22.557 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network afe7be80-c16b-4cef-89c4-8851641c6faf
Oct 07 22:17:22 compute-0 systemd-udevd[226127]: Network interface NamePolicy= disabled on kernel command line.
Oct 07 22:17:22 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:17:22.574 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[1dc7811f-7184-4c64-b2c8-ba411de8b7f3]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:17:22 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:17:22.575 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapafe7be80-c1 in ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Oct 07 22:17:22 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:17:22.577 214116 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapafe7be80-c0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Oct 07 22:17:22 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:17:22.577 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[855cec27-b0c9-4bd5-975c-805152c2453e]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:17:22 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:17:22.578 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[37abaed3-acd1-45eb-9a2d-8e1ddf1f6170]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:17:22 compute-0 NetworkManager[51722]: <info>  [1759875442.5862] device (tapcf1b9092-14): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 07 22:17:22 compute-0 NetworkManager[51722]: <info>  [1759875442.5874] device (tapcf1b9092-14): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 07 22:17:22 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:17:22.593 103905 DEBUG oslo.privsep.daemon [-] privsep: reply[991871db-6514-4902-a71a-488cc20294a5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:17:22 compute-0 systemd-machined[152719]: New machine qemu-19-instance-0000001a.
Oct 07 22:17:22 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:17:22.630 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[8f46274c-46df-47eb-ad8b-6c26ab252cb2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:17:22 compute-0 nova_compute[192716]: 2025-10-07 22:17:22.635 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:17:22 compute-0 systemd[1]: Started Virtual Machine qemu-19-instance-0000001a.
Oct 07 22:17:22 compute-0 ovn_controller[94904]: 2025-10-07T22:17:22Z|00225|binding|INFO|Setting lport cf1b9092-14e3-4326-99c8-2801b53abb26 ovn-installed in OVS
Oct 07 22:17:22 compute-0 nova_compute[192716]: 2025-10-07 22:17:22.639 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:17:22 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:17:22.672 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[1fb9bae5-286b-40cd-b859-93e9d3d345ae]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:17:22 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:17:22.677 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[8dfe7032-449d-4712-8cda-f588b3b9e45e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:17:22 compute-0 NetworkManager[51722]: <info>  [1759875442.6798] manager: (tapafe7be80-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/83)
Oct 07 22:17:22 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:17:22.727 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[c42fde32-a87d-418a-afa5-9cd923511bea]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:17:22 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:17:22.730 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[9da87b21-e1df-4ca5-b124-82363265e519]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:17:22 compute-0 NetworkManager[51722]: <info>  [1759875442.7567] device (tapafe7be80-c0): carrier: link connected
Oct 07 22:17:22 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:17:22.762 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[2da4784e-bd77-48ed-95f9-e509627d96bf]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:17:22 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:17:22.784 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[d90b011d-de74-47e1-8656-6323b03eb065]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapafe7be80-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2b:b4:6d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 60], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 517633, 'reachable_time': 17872, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226160, 'error': None, 'target': 'ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:17:22 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:17:22.807 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[7ca65ea9-f7e0-493c-8218-42cdd6a7b2c9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2b:b46d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 517633, 'tstamp': 517633}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226161, 'error': None, 'target': 'ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:17:22 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:17:22.834 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[ce51f8e9-0c65-4f01-89ae-7d9a1d5260b3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapafe7be80-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2b:b4:6d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 60], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 517633, 'reachable_time': 17872, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226162, 'error': None, 'target': 'ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:17:22 compute-0 nova_compute[192716]: 2025-10-07 22:17:22.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:17:22 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:17:22.877 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[fc47341a-2109-4bd8-b9ed-c84b382985e8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:17:22 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:17:22.973 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[6d9cfd1a-3631-4f3f-889f-fe77c28780f1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:17:22 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:17:22.975 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapafe7be80-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:17:22 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:17:22.976 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:17:22 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:17:22.976 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapafe7be80-c0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:17:22 compute-0 nova_compute[192716]: 2025-10-07 22:17:22.978 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:17:22 compute-0 kernel: tapafe7be80-c0: entered promiscuous mode
Oct 07 22:17:22 compute-0 NetworkManager[51722]: <info>  [1759875442.9800] manager: (tapafe7be80-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/84)
Oct 07 22:17:22 compute-0 nova_compute[192716]: 2025-10-07 22:17:22.983 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:17:22 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:17:22.983 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapafe7be80-c0, col_values=(('external_ids', {'iface-id': 'b656ca07-6e70-4919-b525-077e26d9c217'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:17:22 compute-0 nova_compute[192716]: 2025-10-07 22:17:22.984 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:17:22 compute-0 ovn_controller[94904]: 2025-10-07T22:17:22Z|00226|binding|INFO|Releasing lport b656ca07-6e70-4919-b525-077e26d9c217 from this chassis (sb_readonly=0)
Oct 07 22:17:23 compute-0 nova_compute[192716]: 2025-10-07 22:17:23.010 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:17:23 compute-0 nova_compute[192716]: 2025-10-07 22:17:23.013 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:17:23 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:17:23.015 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[534824a8-7684-4bdf-8cab-865acb21ba2d]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:17:23 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:17:23.016 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/afe7be80-c16b-4cef-89c4-8851641c6faf.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/afe7be80-c16b-4cef-89c4-8851641c6faf.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 22:17:23 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:17:23.017 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/afe7be80-c16b-4cef-89c4-8851641c6faf.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/afe7be80-c16b-4cef-89c4-8851641c6faf.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 22:17:23 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:17:23.017 103791 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for afe7be80-c16b-4cef-89c4-8851641c6faf disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Oct 07 22:17:23 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:17:23.017 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/afe7be80-c16b-4cef-89c4-8851641c6faf.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/afe7be80-c16b-4cef-89c4-8851641c6faf.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 22:17:23 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:17:23.018 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[d74e2d13-d58a-4a98-adab-d585dcbbae30]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:17:23 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:17:23.018 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/afe7be80-c16b-4cef-89c4-8851641c6faf.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/afe7be80-c16b-4cef-89c4-8851641c6faf.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 22:17:23 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:17:23.019 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[0702d3e4-adce-4535-a01e-8f643d0debb4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:17:23 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:17:23.020 103791 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Oct 07 22:17:23 compute-0 ovn_metadata_agent[103786]: global
Oct 07 22:17:23 compute-0 ovn_metadata_agent[103786]:     log         /dev/log local0 debug
Oct 07 22:17:23 compute-0 ovn_metadata_agent[103786]:     log-tag     haproxy-metadata-proxy-afe7be80-c16b-4cef-89c4-8851641c6faf
Oct 07 22:17:23 compute-0 ovn_metadata_agent[103786]:     user        root
Oct 07 22:17:23 compute-0 ovn_metadata_agent[103786]:     group       root
Oct 07 22:17:23 compute-0 ovn_metadata_agent[103786]:     maxconn     1024
Oct 07 22:17:23 compute-0 ovn_metadata_agent[103786]:     pidfile     /var/lib/neutron/external/pids/afe7be80-c16b-4cef-89c4-8851641c6faf.pid.haproxy
Oct 07 22:17:23 compute-0 ovn_metadata_agent[103786]:     daemon
Oct 07 22:17:23 compute-0 ovn_metadata_agent[103786]: 
Oct 07 22:17:23 compute-0 ovn_metadata_agent[103786]: defaults
Oct 07 22:17:23 compute-0 ovn_metadata_agent[103786]:     log global
Oct 07 22:17:23 compute-0 ovn_metadata_agent[103786]:     mode http
Oct 07 22:17:23 compute-0 ovn_metadata_agent[103786]:     option httplog
Oct 07 22:17:23 compute-0 ovn_metadata_agent[103786]:     option dontlognull
Oct 07 22:17:23 compute-0 ovn_metadata_agent[103786]:     option http-server-close
Oct 07 22:17:23 compute-0 ovn_metadata_agent[103786]:     option forwardfor
Oct 07 22:17:23 compute-0 ovn_metadata_agent[103786]:     retries                 3
Oct 07 22:17:23 compute-0 ovn_metadata_agent[103786]:     timeout http-request    30s
Oct 07 22:17:23 compute-0 ovn_metadata_agent[103786]:     timeout connect         30s
Oct 07 22:17:23 compute-0 ovn_metadata_agent[103786]:     timeout client          32s
Oct 07 22:17:23 compute-0 ovn_metadata_agent[103786]:     timeout server          32s
Oct 07 22:17:23 compute-0 ovn_metadata_agent[103786]:     timeout http-keep-alive 30s
Oct 07 22:17:23 compute-0 ovn_metadata_agent[103786]: 
Oct 07 22:17:23 compute-0 ovn_metadata_agent[103786]: listen listener
Oct 07 22:17:23 compute-0 ovn_metadata_agent[103786]:     bind 169.254.169.254:80
Oct 07 22:17:23 compute-0 ovn_metadata_agent[103786]:     
Oct 07 22:17:23 compute-0 ovn_metadata_agent[103786]:     server metadata /var/lib/neutron/metadata_proxy
Oct 07 22:17:23 compute-0 ovn_metadata_agent[103786]: 
Oct 07 22:17:23 compute-0 ovn_metadata_agent[103786]:     http-request add-header X-OVN-Network-ID afe7be80-c16b-4cef-89c4-8851641c6faf
Oct 07 22:17:23 compute-0 ovn_metadata_agent[103786]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Oct 07 22:17:23 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:17:23.020 103791 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf', 'env', 'PROCESS_TAG=haproxy-afe7be80-c16b-4cef-89c4-8851641c6faf', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/afe7be80-c16b-4cef-89c4-8851641c6faf.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Oct 07 22:17:23 compute-0 podman[226201]: 2025-10-07 22:17:23.502428029 +0000 UTC m=+0.058439690 container create 734fbdf485fe7e3165df00b025c3347df77c1c9e515fbc0332456995f3e0f0d7 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct 07 22:17:23 compute-0 systemd[1]: Started libpod-conmon-734fbdf485fe7e3165df00b025c3347df77c1c9e515fbc0332456995f3e0f0d7.scope.
Oct 07 22:17:23 compute-0 podman[226201]: 2025-10-07 22:17:23.474810332 +0000 UTC m=+0.030822003 image pull 24d4277b41bbd1d97b6f360ea068040fe96182680512bacad34d1f578f4798a9 38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Oct 07 22:17:23 compute-0 systemd[1]: Started libcrun container.
Oct 07 22:17:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/311fc502461087d7dd92bf90ab2f746dfa2e44dacded5f03a3b714f6fa7769ab/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 07 22:17:23 compute-0 podman[226201]: 2025-10-07 22:17:23.593099722 +0000 UTC m=+0.149111453 container init 734fbdf485fe7e3165df00b025c3347df77c1c9e515fbc0332456995f3e0f0d7 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251007, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct 07 22:17:23 compute-0 podman[226201]: 2025-10-07 22:17:23.602072575 +0000 UTC m=+0.158084236 container start 734fbdf485fe7e3165df00b025c3347df77c1c9e515fbc0332456995f3e0f0d7 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS)
Oct 07 22:17:23 compute-0 podman[226214]: 2025-10-07 22:17:23.616081354 +0000 UTC m=+0.064053014 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true)
Oct 07 22:17:23 compute-0 podman[226217]: 2025-10-07 22:17:23.61728958 +0000 UTC m=+0.071541514 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible)
Oct 07 22:17:23 compute-0 neutron-haproxy-ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf[226218]: [NOTICE]   (226253) : New worker (226260) forked
Oct 07 22:17:23 compute-0 neutron-haproxy-ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf[226218]: [NOTICE]   (226253) : Loading success.
Oct 07 22:17:24 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:17:24.828 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=28, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:e1:cc', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '62:84:ba:a4:1b:31'}, ipsec=False) old=SB_Global(nb_cfg=27) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:17:24 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:17:24.842 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 07 22:17:24 compute-0 nova_compute[192716]: 2025-10-07 22:17:24.843 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:17:25 compute-0 ovn_controller[94904]: 2025-10-07T22:17:25Z|00227|binding|INFO|Claiming lport cf1b9092-14e3-4326-99c8-2801b53abb26 for this chassis.
Oct 07 22:17:25 compute-0 ovn_controller[94904]: 2025-10-07T22:17:25Z|00228|binding|INFO|cf1b9092-14e3-4326-99c8-2801b53abb26: Claiming fa:16:3e:6c:2c:cf 10.100.0.14
Oct 07 22:17:25 compute-0 ovn_controller[94904]: 2025-10-07T22:17:25Z|00229|binding|INFO|Setting lport cf1b9092-14e3-4326-99c8-2801b53abb26 up in Southbound
Oct 07 22:17:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:17:25.656 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:17:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:17:25.656 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:17:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:17:25.657 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:17:25 compute-0 nova_compute[192716]: 2025-10-07 22:17:25.752 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:17:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:17:25.843 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=dca786dc-b408-4181-8e47-0e14c60f13da, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '28'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:17:26 compute-0 nova_compute[192716]: 2025-10-07 22:17:26.396 2 INFO nova.compute.manager [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: e7c02487-5b40-4170-ba2a-9c025aae40b9] Post operation of migration started
Oct 07 22:17:26 compute-0 nova_compute[192716]: 2025-10-07 22:17:26.397 2 WARNING neutronclient.v2_0.client [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:17:26 compute-0 nova_compute[192716]: 2025-10-07 22:17:26.530 2 WARNING neutronclient.v2_0.client [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:17:26 compute-0 nova_compute[192716]: 2025-10-07 22:17:26.530 2 WARNING neutronclient.v2_0.client [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:17:26 compute-0 nova_compute[192716]: 2025-10-07 22:17:26.651 2 DEBUG oslo_concurrency.lockutils [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "refresh_cache-e7c02487-5b40-4170-ba2a-9c025aae40b9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 07 22:17:26 compute-0 nova_compute[192716]: 2025-10-07 22:17:26.651 2 DEBUG oslo_concurrency.lockutils [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquired lock "refresh_cache-e7c02487-5b40-4170-ba2a-9c025aae40b9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 07 22:17:26 compute-0 nova_compute[192716]: 2025-10-07 22:17:26.652 2 DEBUG nova.network.neutron [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: e7c02487-5b40-4170-ba2a-9c025aae40b9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 07 22:17:26 compute-0 podman[226280]: 2025-10-07 22:17:26.829586407 +0000 UTC m=+0.071674408 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 07 22:17:27 compute-0 nova_compute[192716]: 2025-10-07 22:17:27.159 2 WARNING neutronclient.v2_0.client [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:17:27 compute-0 nova_compute[192716]: 2025-10-07 22:17:27.854 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:17:27 compute-0 nova_compute[192716]: 2025-10-07 22:17:27.981 2 WARNING neutronclient.v2_0.client [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:17:28 compute-0 nova_compute[192716]: 2025-10-07 22:17:28.216 2 DEBUG nova.network.neutron [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: e7c02487-5b40-4170-ba2a-9c025aae40b9] Updating instance_info_cache with network_info: [{"id": "cf1b9092-14e3-4326-99c8-2801b53abb26", "address": "fa:16:3e:6c:2c:cf", "network": {"id": "afe7be80-c16b-4cef-89c4-8851641c6faf", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-895931661-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f18b68e1837842c293e8b6a621641354", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcf1b9092-14", "ovs_interfaceid": "cf1b9092-14e3-4326-99c8-2801b53abb26", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:17:28 compute-0 nova_compute[192716]: 2025-10-07 22:17:28.723 2 DEBUG oslo_concurrency.lockutils [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Releasing lock "refresh_cache-e7c02487-5b40-4170-ba2a-9c025aae40b9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 07 22:17:29 compute-0 nova_compute[192716]: 2025-10-07 22:17:29.246 2 DEBUG oslo_concurrency.lockutils [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:17:29 compute-0 nova_compute[192716]: 2025-10-07 22:17:29.246 2 DEBUG oslo_concurrency.lockutils [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:17:29 compute-0 nova_compute[192716]: 2025-10-07 22:17:29.247 2 DEBUG oslo_concurrency.lockutils [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:17:29 compute-0 nova_compute[192716]: 2025-10-07 22:17:29.253 2 INFO nova.virt.libvirt.driver [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: e7c02487-5b40-4170-ba2a-9c025aae40b9] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Oct 07 22:17:29 compute-0 virtqemud[192532]: Domain id=19 name='instance-0000001a' uuid=e7c02487-5b40-4170-ba2a-9c025aae40b9 is tainted: custom-monitor
Oct 07 22:17:29 compute-0 podman[203153]: time="2025-10-07T22:17:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:17:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:17:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20751 "" "Go-http-client/1.1"
Oct 07 22:17:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:17:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3492 "" "Go-http-client/1.1"
Oct 07 22:17:30 compute-0 nova_compute[192716]: 2025-10-07 22:17:30.263 2 INFO nova.virt.libvirt.driver [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: e7c02487-5b40-4170-ba2a-9c025aae40b9] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Oct 07 22:17:30 compute-0 nova_compute[192716]: 2025-10-07 22:17:30.755 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:17:31 compute-0 nova_compute[192716]: 2025-10-07 22:17:31.270 2 INFO nova.virt.libvirt.driver [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: e7c02487-5b40-4170-ba2a-9c025aae40b9] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Oct 07 22:17:31 compute-0 nova_compute[192716]: 2025-10-07 22:17:31.275 2 DEBUG nova.compute.manager [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: e7c02487-5b40-4170-ba2a-9c025aae40b9] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 07 22:17:31 compute-0 openstack_network_exporter[205305]: ERROR   22:17:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:17:31 compute-0 openstack_network_exporter[205305]: ERROR   22:17:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:17:31 compute-0 openstack_network_exporter[205305]: ERROR   22:17:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:17:31 compute-0 openstack_network_exporter[205305]: ERROR   22:17:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:17:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:17:31 compute-0 openstack_network_exporter[205305]: ERROR   22:17:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:17:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:17:31 compute-0 nova_compute[192716]: 2025-10-07 22:17:31.787 2 DEBUG nova.objects.instance [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: e7c02487-5b40-4170-ba2a-9c025aae40b9] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Oct 07 22:17:32 compute-0 nova_compute[192716]: 2025-10-07 22:17:32.806 2 WARNING neutronclient.v2_0.client [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:17:32 compute-0 nova_compute[192716]: 2025-10-07 22:17:32.856 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:17:33 compute-0 nova_compute[192716]: 2025-10-07 22:17:33.524 2 WARNING neutronclient.v2_0.client [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:17:33 compute-0 nova_compute[192716]: 2025-10-07 22:17:33.525 2 WARNING neutronclient.v2_0.client [None req-6cfe9604-7387-464a-884c-91f062960b86 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:17:34 compute-0 podman[226306]: 2025-10-07 22:17:34.863352508 +0000 UTC m=+0.088942253 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Oct 07 22:17:34 compute-0 podman[226305]: 2025-10-07 22:17:34.938434514 +0000 UTC m=+0.170912591 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest)
Oct 07 22:17:35 compute-0 nova_compute[192716]: 2025-10-07 22:17:35.759 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:17:37 compute-0 nova_compute[192716]: 2025-10-07 22:17:37.860 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:17:40 compute-0 nova_compute[192716]: 2025-10-07 22:17:40.501 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:17:40 compute-0 nova_compute[192716]: 2025-10-07 22:17:40.502 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 07 22:17:40 compute-0 nova_compute[192716]: 2025-10-07 22:17:40.761 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:17:40 compute-0 podman[226352]: 2025-10-07 22:17:40.85739649 +0000 UTC m=+0.085319598 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git, build-date=2025-08-20T13:12:41, distribution-scope=public, name=ubi9-minimal, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, release=1755695350, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Oct 07 22:17:42 compute-0 nova_compute[192716]: 2025-10-07 22:17:42.040 2 DEBUG nova.compute.manager [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp8n0ftfb_',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='5b094b64-8b68-4676-b259-85b0d42a9ec1',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Oct 07 22:17:42 compute-0 nova_compute[192716]: 2025-10-07 22:17:42.861 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:17:42 compute-0 nova_compute[192716]: 2025-10-07 22:17:42.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:17:43 compute-0 nova_compute[192716]: 2025-10-07 22:17:43.057 2 DEBUG oslo_concurrency.lockutils [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "refresh_cache-5b094b64-8b68-4676-b259-85b0d42a9ec1" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 07 22:17:43 compute-0 nova_compute[192716]: 2025-10-07 22:17:43.057 2 DEBUG oslo_concurrency.lockutils [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquired lock "refresh_cache-5b094b64-8b68-4676-b259-85b0d42a9ec1" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 07 22:17:43 compute-0 nova_compute[192716]: 2025-10-07 22:17:43.057 2 DEBUG nova.network.neutron [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 5b094b64-8b68-4676-b259-85b0d42a9ec1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 07 22:17:43 compute-0 nova_compute[192716]: 2025-10-07 22:17:43.563 2 WARNING neutronclient.v2_0.client [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:17:43 compute-0 nova_compute[192716]: 2025-10-07 22:17:43.985 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:17:44 compute-0 nova_compute[192716]: 2025-10-07 22:17:44.634 2 WARNING neutronclient.v2_0.client [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:17:45 compute-0 nova_compute[192716]: 2025-10-07 22:17:45.518 2 DEBUG nova.network.neutron [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 5b094b64-8b68-4676-b259-85b0d42a9ec1] Updating instance_info_cache with network_info: [{"id": "a85533b2-ea61-40a6-a339-d4654846b93d", "address": "fa:16:3e:a6:0d:ee", "network": {"id": "afe7be80-c16b-4cef-89c4-8851641c6faf", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-895931661-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f18b68e1837842c293e8b6a621641354", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa85533b2-ea", "ovs_interfaceid": "a85533b2-ea61-40a6-a339-d4654846b93d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:17:45 compute-0 nova_compute[192716]: 2025-10-07 22:17:45.762 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:17:46 compute-0 nova_compute[192716]: 2025-10-07 22:17:46.026 2 DEBUG oslo_concurrency.lockutils [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Releasing lock "refresh_cache-5b094b64-8b68-4676-b259-85b0d42a9ec1" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 07 22:17:46 compute-0 nova_compute[192716]: 2025-10-07 22:17:46.042 2 DEBUG nova.virt.libvirt.driver [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 5b094b64-8b68-4676-b259-85b0d42a9ec1] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp8n0ftfb_',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='5b094b64-8b68-4676-b259-85b0d42a9ec1',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Oct 07 22:17:46 compute-0 nova_compute[192716]: 2025-10-07 22:17:46.043 2 DEBUG nova.virt.libvirt.driver [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 5b094b64-8b68-4676-b259-85b0d42a9ec1] Creating instance directory: /var/lib/nova/instances/5b094b64-8b68-4676-b259-85b0d42a9ec1 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Oct 07 22:17:46 compute-0 nova_compute[192716]: 2025-10-07 22:17:46.044 2 DEBUG nova.virt.libvirt.driver [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 5b094b64-8b68-4676-b259-85b0d42a9ec1] Creating disk.info with the contents: {'/var/lib/nova/instances/5b094b64-8b68-4676-b259-85b0d42a9ec1/disk': 'qcow2', '/var/lib/nova/instances/5b094b64-8b68-4676-b259-85b0d42a9ec1/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Oct 07 22:17:46 compute-0 nova_compute[192716]: 2025-10-07 22:17:46.045 2 DEBUG nova.virt.libvirt.driver [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 5b094b64-8b68-4676-b259-85b0d42a9ec1] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Oct 07 22:17:46 compute-0 nova_compute[192716]: 2025-10-07 22:17:46.046 2 DEBUG nova.objects.instance [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 5b094b64-8b68-4676-b259-85b0d42a9ec1 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 22:17:46 compute-0 nova_compute[192716]: 2025-10-07 22:17:46.555 2 DEBUG oslo_utils.imageutils.format_inspector [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:17:46 compute-0 nova_compute[192716]: 2025-10-07 22:17:46.561 2 DEBUG oslo_utils.imageutils.format_inspector [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:17:46 compute-0 nova_compute[192716]: 2025-10-07 22:17:46.564 2 DEBUG oslo_concurrency.processutils [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:17:46 compute-0 nova_compute[192716]: 2025-10-07 22:17:46.621 2 DEBUG oslo_concurrency.processutils [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:17:46 compute-0 nova_compute[192716]: 2025-10-07 22:17:46.623 2 DEBUG oslo_concurrency.lockutils [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:17:46 compute-0 nova_compute[192716]: 2025-10-07 22:17:46.624 2 DEBUG oslo_concurrency.lockutils [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:17:46 compute-0 nova_compute[192716]: 2025-10-07 22:17:46.625 2 DEBUG oslo_utils.imageutils.format_inspector [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:17:46 compute-0 nova_compute[192716]: 2025-10-07 22:17:46.631 2 DEBUG oslo_utils.imageutils.format_inspector [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:17:46 compute-0 nova_compute[192716]: 2025-10-07 22:17:46.632 2 DEBUG oslo_concurrency.processutils [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:17:46 compute-0 nova_compute[192716]: 2025-10-07 22:17:46.721 2 DEBUG oslo_concurrency.processutils [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:17:46 compute-0 nova_compute[192716]: 2025-10-07 22:17:46.723 2 DEBUG oslo_concurrency.processutils [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71,backing_fmt=raw /var/lib/nova/instances/5b094b64-8b68-4676-b259-85b0d42a9ec1/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:17:46 compute-0 nova_compute[192716]: 2025-10-07 22:17:46.766 2 DEBUG oslo_concurrency.processutils [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71,backing_fmt=raw /var/lib/nova/instances/5b094b64-8b68-4676-b259-85b0d42a9ec1/disk 1073741824" returned: 0 in 0.043s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:17:46 compute-0 nova_compute[192716]: 2025-10-07 22:17:46.768 2 DEBUG oslo_concurrency.lockutils [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.144s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:17:46 compute-0 nova_compute[192716]: 2025-10-07 22:17:46.769 2 DEBUG oslo_concurrency.processutils [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:17:46 compute-0 nova_compute[192716]: 2025-10-07 22:17:46.851 2 DEBUG oslo_concurrency.processutils [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:17:46 compute-0 nova_compute[192716]: 2025-10-07 22:17:46.852 2 DEBUG nova.virt.disk.api [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Checking if we can resize image /var/lib/nova/instances/5b094b64-8b68-4676-b259-85b0d42a9ec1/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 07 22:17:46 compute-0 nova_compute[192716]: 2025-10-07 22:17:46.852 2 DEBUG oslo_concurrency.processutils [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5b094b64-8b68-4676-b259-85b0d42a9ec1/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:17:46 compute-0 nova_compute[192716]: 2025-10-07 22:17:46.916 2 DEBUG oslo_concurrency.processutils [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5b094b64-8b68-4676-b259-85b0d42a9ec1/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:17:46 compute-0 nova_compute[192716]: 2025-10-07 22:17:46.917 2 DEBUG nova.virt.disk.api [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Cannot resize image /var/lib/nova/instances/5b094b64-8b68-4676-b259-85b0d42a9ec1/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 07 22:17:46 compute-0 nova_compute[192716]: 2025-10-07 22:17:46.918 2 DEBUG nova.objects.instance [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lazy-loading 'migration_context' on Instance uuid 5b094b64-8b68-4676-b259-85b0d42a9ec1 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 22:17:46 compute-0 nova_compute[192716]: 2025-10-07 22:17:46.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:17:46 compute-0 nova_compute[192716]: 2025-10-07 22:17:46.991 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:17:47 compute-0 nova_compute[192716]: 2025-10-07 22:17:47.426 2 DEBUG nova.objects.base [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Object Instance<5b094b64-8b68-4676-b259-85b0d42a9ec1> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Oct 07 22:17:47 compute-0 nova_compute[192716]: 2025-10-07 22:17:47.427 2 DEBUG oslo_concurrency.processutils [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/5b094b64-8b68-4676-b259-85b0d42a9ec1/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:17:47 compute-0 nova_compute[192716]: 2025-10-07 22:17:47.456 2 DEBUG oslo_concurrency.processutils [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/5b094b64-8b68-4676-b259-85b0d42a9ec1/disk.config 497664" returned: 0 in 0.029s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:17:47 compute-0 nova_compute[192716]: 2025-10-07 22:17:47.457 2 DEBUG nova.virt.libvirt.driver [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 5b094b64-8b68-4676-b259-85b0d42a9ec1] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Oct 07 22:17:47 compute-0 nova_compute[192716]: 2025-10-07 22:17:47.459 2 DEBUG nova.virt.libvirt.vif [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-10-07T22:16:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-1137573810',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-1137573',id=27,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-07T22:16:34Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='4cb01004a26f472187e01e5d3a57f84a',ramdisk_id='',reservation_id='r-p0cxtqp7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-866189760',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-866189760-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T22:16:34Z,user_data=None,user_id='a0c373c3cf7242d4af22e259b5a27a6b',uuid=5b094b64-8b68-4676-b259-85b0d42a9ec1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a85533b2-ea61-40a6-a339-d4654846b93d", "address": "fa:16:3e:a6:0d:ee", "network": {"id": "afe7be80-c16b-4cef-89c4-8851641c6faf", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-895931661-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f18b68e1837842c293e8b6a621641354", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapa85533b2-ea", "ovs_interfaceid": "a85533b2-ea61-40a6-a339-d4654846b93d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 07 22:17:47 compute-0 nova_compute[192716]: 2025-10-07 22:17:47.459 2 DEBUG nova.network.os_vif_util [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Converting VIF {"id": "a85533b2-ea61-40a6-a339-d4654846b93d", "address": "fa:16:3e:a6:0d:ee", "network": {"id": "afe7be80-c16b-4cef-89c4-8851641c6faf", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-895931661-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f18b68e1837842c293e8b6a621641354", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapa85533b2-ea", "ovs_interfaceid": "a85533b2-ea61-40a6-a339-d4654846b93d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 22:17:47 compute-0 nova_compute[192716]: 2025-10-07 22:17:47.460 2 DEBUG nova.network.os_vif_util [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a6:0d:ee,bridge_name='br-int',has_traffic_filtering=True,id=a85533b2-ea61-40a6-a339-d4654846b93d,network=Network(afe7be80-c16b-4cef-89c4-8851641c6faf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa85533b2-ea') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 22:17:47 compute-0 nova_compute[192716]: 2025-10-07 22:17:47.460 2 DEBUG os_vif [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a6:0d:ee,bridge_name='br-int',has_traffic_filtering=True,id=a85533b2-ea61-40a6-a339-d4654846b93d,network=Network(afe7be80-c16b-4cef-89c4-8851641c6faf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa85533b2-ea') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 07 22:17:47 compute-0 nova_compute[192716]: 2025-10-07 22:17:47.461 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:17:47 compute-0 nova_compute[192716]: 2025-10-07 22:17:47.461 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:17:47 compute-0 nova_compute[192716]: 2025-10-07 22:17:47.462 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:17:47 compute-0 nova_compute[192716]: 2025-10-07 22:17:47.463 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:17:47 compute-0 nova_compute[192716]: 2025-10-07 22:17:47.463 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '9c10d05d-7b96-58da-9b4e-7f4cac952ecd', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:17:47 compute-0 nova_compute[192716]: 2025-10-07 22:17:47.465 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:17:47 compute-0 nova_compute[192716]: 2025-10-07 22:17:47.468 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 07 22:17:47 compute-0 nova_compute[192716]: 2025-10-07 22:17:47.472 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:17:47 compute-0 nova_compute[192716]: 2025-10-07 22:17:47.473 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa85533b2-ea, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:17:47 compute-0 nova_compute[192716]: 2025-10-07 22:17:47.474 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapa85533b2-ea, col_values=(('qos', UUID('4655fbed-9aa2-4ae6-bec3-05cfcb44cf83')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:17:47 compute-0 nova_compute[192716]: 2025-10-07 22:17:47.474 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapa85533b2-ea, col_values=(('external_ids', {'iface-id': 'a85533b2-ea61-40a6-a339-d4654846b93d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a6:0d:ee', 'vm-uuid': '5b094b64-8b68-4676-b259-85b0d42a9ec1'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:17:47 compute-0 nova_compute[192716]: 2025-10-07 22:17:47.476 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:17:47 compute-0 NetworkManager[51722]: <info>  [1759875467.4783] manager: (tapa85533b2-ea): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/85)
Oct 07 22:17:47 compute-0 nova_compute[192716]: 2025-10-07 22:17:47.480 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 07 22:17:47 compute-0 nova_compute[192716]: 2025-10-07 22:17:47.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:17:47 compute-0 nova_compute[192716]: 2025-10-07 22:17:47.486 2 INFO os_vif [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a6:0d:ee,bridge_name='br-int',has_traffic_filtering=True,id=a85533b2-ea61-40a6-a339-d4654846b93d,network=Network(afe7be80-c16b-4cef-89c4-8851641c6faf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa85533b2-ea')
Oct 07 22:17:47 compute-0 nova_compute[192716]: 2025-10-07 22:17:47.487 2 DEBUG nova.virt.libvirt.driver [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Oct 07 22:17:47 compute-0 nova_compute[192716]: 2025-10-07 22:17:47.488 2 DEBUG nova.compute.manager [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp8n0ftfb_',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='5b094b64-8b68-4676-b259-85b0d42a9ec1',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Oct 07 22:17:47 compute-0 nova_compute[192716]: 2025-10-07 22:17:47.489 2 WARNING neutronclient.v2_0.client [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:17:47 compute-0 nova_compute[192716]: 2025-10-07 22:17:47.507 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:17:47 compute-0 nova_compute[192716]: 2025-10-07 22:17:47.508 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:17:47 compute-0 nova_compute[192716]: 2025-10-07 22:17:47.509 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:17:47 compute-0 nova_compute[192716]: 2025-10-07 22:17:47.509 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 07 22:17:47 compute-0 nova_compute[192716]: 2025-10-07 22:17:47.600 2 WARNING neutronclient.v2_0.client [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:17:48 compute-0 nova_compute[192716]: 2025-10-07 22:17:48.566 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e7c02487-5b40-4170-ba2a-9c025aae40b9/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:17:48 compute-0 nova_compute[192716]: 2025-10-07 22:17:48.588 2 DEBUG nova.network.neutron [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 5b094b64-8b68-4676-b259-85b0d42a9ec1] Port a85533b2-ea61-40a6-a339-d4654846b93d updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Oct 07 22:17:48 compute-0 nova_compute[192716]: 2025-10-07 22:17:48.602 2 DEBUG nova.compute.manager [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp8n0ftfb_',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='5b094b64-8b68-4676-b259-85b0d42a9ec1',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Oct 07 22:17:48 compute-0 nova_compute[192716]: 2025-10-07 22:17:48.636 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e7c02487-5b40-4170-ba2a-9c025aae40b9/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:17:48 compute-0 nova_compute[192716]: 2025-10-07 22:17:48.636 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e7c02487-5b40-4170-ba2a-9c025aae40b9/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:17:48 compute-0 nova_compute[192716]: 2025-10-07 22:17:48.721 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e7c02487-5b40-4170-ba2a-9c025aae40b9/disk --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:17:48 compute-0 nova_compute[192716]: 2025-10-07 22:17:48.954 2 WARNING nova.virt.libvirt.driver [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 22:17:48 compute-0 nova_compute[192716]: 2025-10-07 22:17:48.956 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:17:48 compute-0 nova_compute[192716]: 2025-10-07 22:17:48.988 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.031s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:17:48 compute-0 nova_compute[192716]: 2025-10-07 22:17:48.989 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5665MB free_disk=73.26925659179688GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 07 22:17:48 compute-0 nova_compute[192716]: 2025-10-07 22:17:48.989 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:17:48 compute-0 nova_compute[192716]: 2025-10-07 22:17:48.989 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:17:50 compute-0 nova_compute[192716]: 2025-10-07 22:17:50.518 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Migration for instance 5b094b64-8b68-4676-b259-85b0d42a9ec1 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Oct 07 22:17:50 compute-0 nova_compute[192716]: 2025-10-07 22:17:50.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:17:51 compute-0 nova_compute[192716]: 2025-10-07 22:17:51.026 2 INFO nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] [instance: 5b094b64-8b68-4676-b259-85b0d42a9ec1] Updating resource usage from migration c3fdd511-7a48-4807-9eed-fcef8b9e5986
Oct 07 22:17:51 compute-0 nova_compute[192716]: 2025-10-07 22:17:51.026 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] [instance: 5b094b64-8b68-4676-b259-85b0d42a9ec1] Starting to track incoming migration c3fdd511-7a48-4807-9eed-fcef8b9e5986 with flavor e6ccb6f6-2ec4-4305-9fdd-4d931e83ed21 _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Oct 07 22:17:51 compute-0 nova_compute[192716]: 2025-10-07 22:17:51.637 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Instance e7c02487-5b40-4170-ba2a-9c025aae40b9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 07 22:17:52 compute-0 kernel: tapa85533b2-ea: entered promiscuous mode
Oct 07 22:17:52 compute-0 NetworkManager[51722]: <info>  [1759875472.0816] manager: (tapa85533b2-ea): new Tun device (/org/freedesktop/NetworkManager/Devices/86)
Oct 07 22:17:52 compute-0 ovn_controller[94904]: 2025-10-07T22:17:52Z|00230|binding|INFO|Claiming lport a85533b2-ea61-40a6-a339-d4654846b93d for this additional chassis.
Oct 07 22:17:52 compute-0 ovn_controller[94904]: 2025-10-07T22:17:52Z|00231|binding|INFO|a85533b2-ea61-40a6-a339-d4654846b93d: Claiming fa:16:3e:a6:0d:ee 10.100.0.9
Oct 07 22:17:52 compute-0 nova_compute[192716]: 2025-10-07 22:17:52.085 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:17:52 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:17:52.094 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a6:0d:ee 10.100.0.9'], port_security=['fa:16:3e:a6:0d:ee 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '5b094b64-8b68-4676-b259-85b0d42a9ec1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-afe7be80-c16b-4cef-89c4-8851641c6faf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4cb01004a26f472187e01e5d3a57f84a', 'neutron:revision_number': '10', 'neutron:security_group_ids': '93dab7df-ccdf-44ad-a320-72fe683eb516', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=79a90f3c-820c-43b7-a388-8b7a51286af4, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[], logical_port=a85533b2-ea61-40a6-a339-d4654846b93d) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:17:52 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:17:52.095 103791 INFO neutron.agent.ovn.metadata.agent [-] Port a85533b2-ea61-40a6-a339-d4654846b93d in datapath afe7be80-c16b-4cef-89c4-8851641c6faf unbound from our chassis
Oct 07 22:17:52 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:17:52.098 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network afe7be80-c16b-4cef-89c4-8851641c6faf
Oct 07 22:17:52 compute-0 ovn_controller[94904]: 2025-10-07T22:17:52Z|00232|binding|INFO|Setting lport a85533b2-ea61-40a6-a339-d4654846b93d ovn-installed in OVS
Oct 07 22:17:52 compute-0 nova_compute[192716]: 2025-10-07 22:17:52.101 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:17:52 compute-0 nova_compute[192716]: 2025-10-07 22:17:52.104 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:17:52 compute-0 systemd-udevd[226416]: Network interface NamePolicy= disabled on kernel command line.
Oct 07 22:17:52 compute-0 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 07 22:17:52 compute-0 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 07 22:17:52 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:17:52.122 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[c8915630-4712-4b79-b388-0575b8859cfa]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:17:52 compute-0 systemd-machined[152719]: New machine qemu-20-instance-0000001b.
Oct 07 22:17:52 compute-0 NetworkManager[51722]: <info>  [1759875472.1302] device (tapa85533b2-ea): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 07 22:17:52 compute-0 NetworkManager[51722]: <info>  [1759875472.1310] device (tapa85533b2-ea): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 07 22:17:52 compute-0 systemd[1]: Started Virtual Machine qemu-20-instance-0000001b.
Oct 07 22:17:52 compute-0 nova_compute[192716]: 2025-10-07 22:17:52.145 2 WARNING nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Instance 5b094b64-8b68-4676-b259-85b0d42a9ec1 has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Oct 07 22:17:52 compute-0 nova_compute[192716]: 2025-10-07 22:17:52.145 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 07 22:17:52 compute-0 nova_compute[192716]: 2025-10-07 22:17:52.145 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:17:48 up  1:26,  0 user,  load average: 0.14, 0.12, 0.19\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_4cb01004a26f472187e01e5d3a57f84a': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 07 22:17:52 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:17:52.166 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[bd1ed4f3-5b6a-4777-8349-4072a37312d3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:17:52 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:17:52.170 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[0c0a11b6-4b90-4335-b3c0-b55e4ebb29ba]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:17:52 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:17:52.206 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[efe6cb5f-8a70-4e10-994d-1268037cfe62]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:17:52 compute-0 nova_compute[192716]: 2025-10-07 22:17:52.220 2 DEBUG nova.compute.provider_tree [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 22:17:52 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:17:52.229 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[9a735abe-99ea-4781-8461-4b8491650252]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapafe7be80-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2b:b4:6d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 5, 'rx_bytes': 1672, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 5, 'rx_bytes': 1672, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 60], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 517633, 'reachable_time': 17872, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226431, 'error': None, 'target': 'ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:17:52 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:17:52.253 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[878d062b-72bb-47ac-84aa-6a6a6b521827]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapafe7be80-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 517650, 'tstamp': 517650}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226433, 'error': None, 'target': 'ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapafe7be80-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 517654, 'tstamp': 517654}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226433, 'error': None, 'target': 'ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:17:52 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:17:52.254 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapafe7be80-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:17:52 compute-0 nova_compute[192716]: 2025-10-07 22:17:52.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:17:52 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:17:52.258 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapafe7be80-c0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:17:52 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:17:52.259 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:17:52 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:17:52.259 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapafe7be80-c0, col_values=(('external_ids', {'iface-id': 'b656ca07-6e70-4919-b525-077e26d9c217'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:17:52 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:17:52.260 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:17:52 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:17:52.261 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[0d81d299-c8e0-4e48-bacd-5400cefbcf73]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-afe7be80-c16b-4cef-89c4-8851641c6faf\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/afe7be80-c16b-4cef-89c4-8851641c6faf.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID afe7be80-c16b-4cef-89c4-8851641c6faf\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:17:52 compute-0 nova_compute[192716]: 2025-10-07 22:17:52.477 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:17:52 compute-0 nova_compute[192716]: 2025-10-07 22:17:52.727 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 22:17:53 compute-0 nova_compute[192716]: 2025-10-07 22:17:53.240 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 07 22:17:53 compute-0 nova_compute[192716]: 2025-10-07 22:17:53.241 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.251s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:17:53 compute-0 podman[226442]: 2025-10-07 22:17:53.825005035 +0000 UTC m=+0.063290692 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 07 22:17:53 compute-0 podman[226441]: 2025-10-07 22:17:53.860637468 +0000 UTC m=+0.098377389 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251007, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct 07 22:17:55 compute-0 unix_chkpwd[226491]: password check failed for user (root)
Oct 07 22:17:55 compute-0 sshd-session[226489]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.244  user=root
Oct 07 22:17:55 compute-0 nova_compute[192716]: 2025-10-07 22:17:55.768 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:17:55 compute-0 ovn_controller[94904]: 2025-10-07T22:17:55Z|00233|binding|INFO|Claiming lport a85533b2-ea61-40a6-a339-d4654846b93d for this chassis.
Oct 07 22:17:55 compute-0 ovn_controller[94904]: 2025-10-07T22:17:55Z|00234|binding|INFO|a85533b2-ea61-40a6-a339-d4654846b93d: Claiming fa:16:3e:a6:0d:ee 10.100.0.9
Oct 07 22:17:55 compute-0 ovn_controller[94904]: 2025-10-07T22:17:55Z|00235|binding|INFO|Setting lport a85533b2-ea61-40a6-a339-d4654846b93d up in Southbound
Oct 07 22:17:56 compute-0 nova_compute[192716]: 2025-10-07 22:17:56.995 2 INFO nova.compute.manager [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 5b094b64-8b68-4676-b259-85b0d42a9ec1] Post operation of migration started
Oct 07 22:17:56 compute-0 nova_compute[192716]: 2025-10-07 22:17:56.996 2 WARNING neutronclient.v2_0.client [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:17:57 compute-0 sshd-session[226489]: Failed password for root from 193.46.255.244 port 64770 ssh2
Oct 07 22:17:57 compute-0 nova_compute[192716]: 2025-10-07 22:17:57.531 2 WARNING neutronclient.v2_0.client [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:17:57 compute-0 nova_compute[192716]: 2025-10-07 22:17:57.531 2 WARNING neutronclient.v2_0.client [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:17:57 compute-0 nova_compute[192716]: 2025-10-07 22:17:57.533 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:17:57 compute-0 nova_compute[192716]: 2025-10-07 22:17:57.630 2 DEBUG oslo_concurrency.lockutils [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "refresh_cache-5b094b64-8b68-4676-b259-85b0d42a9ec1" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 07 22:17:57 compute-0 nova_compute[192716]: 2025-10-07 22:17:57.630 2 DEBUG oslo_concurrency.lockutils [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquired lock "refresh_cache-5b094b64-8b68-4676-b259-85b0d42a9ec1" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 07 22:17:57 compute-0 nova_compute[192716]: 2025-10-07 22:17:57.630 2 DEBUG nova.network.neutron [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 5b094b64-8b68-4676-b259-85b0d42a9ec1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 07 22:17:57 compute-0 unix_chkpwd[226492]: password check failed for user (root)
Oct 07 22:17:57 compute-0 podman[226493]: 2025-10-07 22:17:57.835682151 +0000 UTC m=+0.074999535 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 07 22:17:58 compute-0 nova_compute[192716]: 2025-10-07 22:17:58.138 2 WARNING neutronclient.v2_0.client [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:17:58 compute-0 nova_compute[192716]: 2025-10-07 22:17:58.832 2 WARNING neutronclient.v2_0.client [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:17:59 compute-0 nova_compute[192716]: 2025-10-07 22:17:59.043 2 DEBUG nova.network.neutron [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 5b094b64-8b68-4676-b259-85b0d42a9ec1] Updating instance_info_cache with network_info: [{"id": "a85533b2-ea61-40a6-a339-d4654846b93d", "address": "fa:16:3e:a6:0d:ee", "network": {"id": "afe7be80-c16b-4cef-89c4-8851641c6faf", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-895931661-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f18b68e1837842c293e8b6a621641354", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa85533b2-ea", "ovs_interfaceid": "a85533b2-ea61-40a6-a339-d4654846b93d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:17:59 compute-0 nova_compute[192716]: 2025-10-07 22:17:59.241 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:17:59 compute-0 nova_compute[192716]: 2025-10-07 22:17:59.551 2 DEBUG oslo_concurrency.lockutils [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Releasing lock "refresh_cache-5b094b64-8b68-4676-b259-85b0d42a9ec1" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 07 22:17:59 compute-0 nova_compute[192716]: 2025-10-07 22:17:59.756 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:17:59 compute-0 nova_compute[192716]: 2025-10-07 22:17:59.756 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:17:59 compute-0 podman[203153]: time="2025-10-07T22:17:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:17:59 compute-0 nova_compute[192716]: 2025-10-07 22:17:59.757 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:17:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:17:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20751 "" "Go-http-client/1.1"
Oct 07 22:17:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:17:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3487 "" "Go-http-client/1.1"
Oct 07 22:17:59 compute-0 sshd-session[226489]: Failed password for root from 193.46.255.244 port 64770 ssh2
Oct 07 22:18:00 compute-0 nova_compute[192716]: 2025-10-07 22:18:00.077 2 DEBUG oslo_concurrency.lockutils [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:18:00 compute-0 nova_compute[192716]: 2025-10-07 22:18:00.077 2 DEBUG oslo_concurrency.lockutils [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:18:00 compute-0 nova_compute[192716]: 2025-10-07 22:18:00.078 2 DEBUG oslo_concurrency.lockutils [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:18:00 compute-0 nova_compute[192716]: 2025-10-07 22:18:00.084 2 INFO nova.virt.libvirt.driver [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 5b094b64-8b68-4676-b259-85b0d42a9ec1] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Oct 07 22:18:00 compute-0 virtqemud[192532]: Domain id=20 name='instance-0000001b' uuid=5b094b64-8b68-4676-b259-85b0d42a9ec1 is tainted: custom-monitor
Oct 07 22:18:00 compute-0 nova_compute[192716]: 2025-10-07 22:18:00.771 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:18:01 compute-0 nova_compute[192716]: 2025-10-07 22:18:01.095 2 INFO nova.virt.libvirt.driver [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 5b094b64-8b68-4676-b259-85b0d42a9ec1] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Oct 07 22:18:01 compute-0 openstack_network_exporter[205305]: ERROR   22:18:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:18:01 compute-0 openstack_network_exporter[205305]: ERROR   22:18:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:18:01 compute-0 openstack_network_exporter[205305]: ERROR   22:18:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:18:01 compute-0 openstack_network_exporter[205305]: ERROR   22:18:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:18:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:18:01 compute-0 openstack_network_exporter[205305]: ERROR   22:18:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:18:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:18:01 compute-0 unix_chkpwd[226519]: password check failed for user (root)
Oct 07 22:18:02 compute-0 nova_compute[192716]: 2025-10-07 22:18:02.101 2 INFO nova.virt.libvirt.driver [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 5b094b64-8b68-4676-b259-85b0d42a9ec1] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Oct 07 22:18:02 compute-0 nova_compute[192716]: 2025-10-07 22:18:02.108 2 DEBUG nova.compute.manager [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 5b094b64-8b68-4676-b259-85b0d42a9ec1] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 07 22:18:02 compute-0 nova_compute[192716]: 2025-10-07 22:18:02.535 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:18:02 compute-0 nova_compute[192716]: 2025-10-07 22:18:02.620 2 DEBUG nova.objects.instance [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 5b094b64-8b68-4676-b259-85b0d42a9ec1] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Oct 07 22:18:03 compute-0 nova_compute[192716]: 2025-10-07 22:18:03.637 2 WARNING neutronclient.v2_0.client [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:18:03 compute-0 sshd-session[226489]: Failed password for root from 193.46.255.244 port 64770 ssh2
Oct 07 22:18:04 compute-0 sshd-session[226489]: Received disconnect from 193.46.255.244 port 64770:11:  [preauth]
Oct 07 22:18:04 compute-0 sshd-session[226489]: Disconnected from authenticating user root 193.46.255.244 port 64770 [preauth]
Oct 07 22:18:04 compute-0 sshd-session[226489]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.244  user=root
Oct 07 22:18:04 compute-0 nova_compute[192716]: 2025-10-07 22:18:04.535 2 WARNING neutronclient.v2_0.client [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:18:04 compute-0 nova_compute[192716]: 2025-10-07 22:18:04.536 2 WARNING neutronclient.v2_0.client [None req-1ab951f0-9a09-4b0f-9121-24d6f6453a08 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:18:04 compute-0 unix_chkpwd[226522]: password check failed for user (root)
Oct 07 22:18:04 compute-0 sshd-session[226520]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.244  user=root
Oct 07 22:18:05 compute-0 nova_compute[192716]: 2025-10-07 22:18:05.774 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:18:05 compute-0 podman[226524]: 2025-10-07 22:18:05.860812379 +0000 UTC m=+0.085571495 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 07 22:18:05 compute-0 podman[226523]: 2025-10-07 22:18:05.89094836 +0000 UTC m=+0.128646174 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Oct 07 22:18:07 compute-0 nova_compute[192716]: 2025-10-07 22:18:07.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:18:07 compute-0 sshd-session[226520]: Failed password for root from 193.46.255.244 port 63238 ssh2
Oct 07 22:18:08 compute-0 nova_compute[192716]: 2025-10-07 22:18:08.201 2 DEBUG oslo_concurrency.lockutils [None req-110e4d0c-95b4-4dec-ab56-6faa5e1a333c a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Acquiring lock "5b094b64-8b68-4676-b259-85b0d42a9ec1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:18:08 compute-0 nova_compute[192716]: 2025-10-07 22:18:08.202 2 DEBUG oslo_concurrency.lockutils [None req-110e4d0c-95b4-4dec-ab56-6faa5e1a333c a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Lock "5b094b64-8b68-4676-b259-85b0d42a9ec1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:18:08 compute-0 nova_compute[192716]: 2025-10-07 22:18:08.202 2 DEBUG oslo_concurrency.lockutils [None req-110e4d0c-95b4-4dec-ab56-6faa5e1a333c a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Acquiring lock "5b094b64-8b68-4676-b259-85b0d42a9ec1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:18:08 compute-0 nova_compute[192716]: 2025-10-07 22:18:08.203 2 DEBUG oslo_concurrency.lockutils [None req-110e4d0c-95b4-4dec-ab56-6faa5e1a333c a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Lock "5b094b64-8b68-4676-b259-85b0d42a9ec1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:18:08 compute-0 nova_compute[192716]: 2025-10-07 22:18:08.203 2 DEBUG oslo_concurrency.lockutils [None req-110e4d0c-95b4-4dec-ab56-6faa5e1a333c a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Lock "5b094b64-8b68-4676-b259-85b0d42a9ec1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:18:08 compute-0 nova_compute[192716]: 2025-10-07 22:18:08.218 2 INFO nova.compute.manager [None req-110e4d0c-95b4-4dec-ab56-6faa5e1a333c a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] [instance: 5b094b64-8b68-4676-b259-85b0d42a9ec1] Terminating instance
Oct 07 22:18:08 compute-0 nova_compute[192716]: 2025-10-07 22:18:08.735 2 DEBUG nova.compute.manager [None req-110e4d0c-95b4-4dec-ab56-6faa5e1a333c a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] [instance: 5b094b64-8b68-4676-b259-85b0d42a9ec1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 07 22:18:08 compute-0 kernel: tapa85533b2-ea (unregistering): left promiscuous mode
Oct 07 22:18:08 compute-0 NetworkManager[51722]: <info>  [1759875488.7559] device (tapa85533b2-ea): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 07 22:18:08 compute-0 nova_compute[192716]: 2025-10-07 22:18:08.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:18:08 compute-0 ovn_controller[94904]: 2025-10-07T22:18:08Z|00236|binding|INFO|Releasing lport a85533b2-ea61-40a6-a339-d4654846b93d from this chassis (sb_readonly=0)
Oct 07 22:18:08 compute-0 ovn_controller[94904]: 2025-10-07T22:18:08Z|00237|binding|INFO|Setting lport a85533b2-ea61-40a6-a339-d4654846b93d down in Southbound
Oct 07 22:18:08 compute-0 ovn_controller[94904]: 2025-10-07T22:18:08Z|00238|binding|INFO|Removing iface tapa85533b2-ea ovn-installed in OVS
Oct 07 22:18:08 compute-0 nova_compute[192716]: 2025-10-07 22:18:08.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:18:08 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:18:08.778 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a6:0d:ee 10.100.0.9'], port_security=['fa:16:3e:a6:0d:ee 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '5b094b64-8b68-4676-b259-85b0d42a9ec1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-afe7be80-c16b-4cef-89c4-8851641c6faf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4cb01004a26f472187e01e5d3a57f84a', 'neutron:revision_number': '15', 'neutron:security_group_ids': '93dab7df-ccdf-44ad-a320-72fe683eb516', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=79a90f3c-820c-43b7-a388-8b7a51286af4, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], logical_port=a85533b2-ea61-40a6-a339-d4654846b93d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f071072e510>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:18:08 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:18:08.780 103791 INFO neutron.agent.ovn.metadata.agent [-] Port a85533b2-ea61-40a6-a339-d4654846b93d in datapath afe7be80-c16b-4cef-89c4-8851641c6faf unbound from our chassis
Oct 07 22:18:08 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:18:08.781 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network afe7be80-c16b-4cef-89c4-8851641c6faf
Oct 07 22:18:08 compute-0 nova_compute[192716]: 2025-10-07 22:18:08.797 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:18:08 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:18:08.801 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[414c6473-ecda-4625-94c2-049db268fbae]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:18:08 compute-0 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d0000001b.scope: Deactivated successfully.
Oct 07 22:18:08 compute-0 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d0000001b.scope: Consumed 2.685s CPU time.
Oct 07 22:18:08 compute-0 systemd-machined[152719]: Machine qemu-20-instance-0000001b terminated.
Oct 07 22:18:08 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:18:08.845 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[e7a0d815-ab80-47ff-b579-eb70c95ea6ed]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:18:08 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:18:08.848 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[fb9e9df3-6d58-4c62-93eb-ffa6385c0292]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:18:08 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:18:08.892 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[5ad8bbdb-a8b3-4757-b7a8-f27d095c6f38]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:18:08 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:18:08.910 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[30ef92d0-fe2b-42b2-aad0-2b8ebb4eba4c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapafe7be80-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2b:b4:6d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 48, 'tx_packets': 7, 'rx_bytes': 2512, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 48, 'tx_packets': 7, 'rx_bytes': 2512, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 60], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 517633, 'reachable_time': 17872, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226577, 'error': None, 'target': 'ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:18:08 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:18:08.928 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[03818c0e-473e-4a11-af8e-acf546b32c8b]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapafe7be80-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 517650, 'tstamp': 517650}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226578, 'error': None, 'target': 'ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapafe7be80-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 517654, 'tstamp': 517654}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226578, 'error': None, 'target': 'ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:18:08 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:18:08.930 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapafe7be80-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:18:08 compute-0 nova_compute[192716]: 2025-10-07 22:18:08.931 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:18:08 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:18:08.940 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapafe7be80-c0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:18:08 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:18:08.941 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:18:08 compute-0 nova_compute[192716]: 2025-10-07 22:18:08.940 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:18:08 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:18:08.942 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapafe7be80-c0, col_values=(('external_ids', {'iface-id': 'b656ca07-6e70-4919-b525-077e26d9c217'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:18:08 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:18:08.942 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:18:08 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:18:08.943 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[ff930066-343b-472f-83a0-4d7c3f09fca0]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-afe7be80-c16b-4cef-89c4-8851641c6faf\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/afe7be80-c16b-4cef-89c4-8851641c6faf.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID afe7be80-c16b-4cef-89c4-8851641c6faf\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:18:08 compute-0 nova_compute[192716]: 2025-10-07 22:18:08.969 2 DEBUG nova.compute.manager [req-779fd1d6-027c-4799-822b-4e98a7f5c826 req-10f08d69-7c09-4691-aec4-48df44f58db8 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 5b094b64-8b68-4676-b259-85b0d42a9ec1] Received event network-vif-unplugged-a85533b2-ea61-40a6-a339-d4654846b93d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:18:08 compute-0 nova_compute[192716]: 2025-10-07 22:18:08.970 2 DEBUG oslo_concurrency.lockutils [req-779fd1d6-027c-4799-822b-4e98a7f5c826 req-10f08d69-7c09-4691-aec4-48df44f58db8 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "5b094b64-8b68-4676-b259-85b0d42a9ec1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:18:08 compute-0 nova_compute[192716]: 2025-10-07 22:18:08.970 2 DEBUG oslo_concurrency.lockutils [req-779fd1d6-027c-4799-822b-4e98a7f5c826 req-10f08d69-7c09-4691-aec4-48df44f58db8 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "5b094b64-8b68-4676-b259-85b0d42a9ec1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:18:08 compute-0 nova_compute[192716]: 2025-10-07 22:18:08.970 2 DEBUG oslo_concurrency.lockutils [req-779fd1d6-027c-4799-822b-4e98a7f5c826 req-10f08d69-7c09-4691-aec4-48df44f58db8 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "5b094b64-8b68-4676-b259-85b0d42a9ec1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:18:08 compute-0 nova_compute[192716]: 2025-10-07 22:18:08.970 2 DEBUG nova.compute.manager [req-779fd1d6-027c-4799-822b-4e98a7f5c826 req-10f08d69-7c09-4691-aec4-48df44f58db8 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 5b094b64-8b68-4676-b259-85b0d42a9ec1] No waiting events found dispatching network-vif-unplugged-a85533b2-ea61-40a6-a339-d4654846b93d pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 22:18:08 compute-0 nova_compute[192716]: 2025-10-07 22:18:08.970 2 DEBUG nova.compute.manager [req-779fd1d6-027c-4799-822b-4e98a7f5c826 req-10f08d69-7c09-4691-aec4-48df44f58db8 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 5b094b64-8b68-4676-b259-85b0d42a9ec1] Received event network-vif-unplugged-a85533b2-ea61-40a6-a339-d4654846b93d for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 07 22:18:09 compute-0 nova_compute[192716]: 2025-10-07 22:18:09.001 2 INFO nova.virt.libvirt.driver [-] [instance: 5b094b64-8b68-4676-b259-85b0d42a9ec1] Instance destroyed successfully.
Oct 07 22:18:09 compute-0 nova_compute[192716]: 2025-10-07 22:18:09.001 2 DEBUG nova.objects.instance [None req-110e4d0c-95b4-4dec-ab56-6faa5e1a333c a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Lazy-loading 'resources' on Instance uuid 5b094b64-8b68-4676-b259-85b0d42a9ec1 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 22:18:09 compute-0 unix_chkpwd[226597]: password check failed for user (root)
Oct 07 22:18:09 compute-0 nova_compute[192716]: 2025-10-07 22:18:09.509 2 DEBUG nova.virt.libvirt.vif [None req-110e4d0c-95b4-4dec-ab56-6faa5e1a333c a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-10-07T22:16:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-1137573810',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-1137573',id=27,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-07T22:16:34Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4cb01004a26f472187e01e5d3a57f84a',ramdisk_id='',reservation_id='r-p0cxtqp7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',clean_attempts='1',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_h
w_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-866189760',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-866189760-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T22:18:03Z,user_data=None,user_id='a0c373c3cf7242d4af22e259b5a27a6b',uuid=5b094b64-8b68-4676-b259-85b0d42a9ec1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a85533b2-ea61-40a6-a339-d4654846b93d", "address": "fa:16:3e:a6:0d:ee", "network": {"id": "afe7be80-c16b-4cef-89c4-8851641c6faf", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-895931661-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f18b68e1837842c293e8b6a621641354", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa85533b2-ea", "ovs_interfaceid": "a85533b2-ea61-40a6-a339-d4654846b93d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 07 22:18:09 compute-0 nova_compute[192716]: 2025-10-07 22:18:09.509 2 DEBUG nova.network.os_vif_util [None req-110e4d0c-95b4-4dec-ab56-6faa5e1a333c a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Converting VIF {"id": "a85533b2-ea61-40a6-a339-d4654846b93d", "address": "fa:16:3e:a6:0d:ee", "network": {"id": "afe7be80-c16b-4cef-89c4-8851641c6faf", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-895931661-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f18b68e1837842c293e8b6a621641354", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa85533b2-ea", "ovs_interfaceid": "a85533b2-ea61-40a6-a339-d4654846b93d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 22:18:09 compute-0 nova_compute[192716]: 2025-10-07 22:18:09.511 2 DEBUG nova.network.os_vif_util [None req-110e4d0c-95b4-4dec-ab56-6faa5e1a333c a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a6:0d:ee,bridge_name='br-int',has_traffic_filtering=True,id=a85533b2-ea61-40a6-a339-d4654846b93d,network=Network(afe7be80-c16b-4cef-89c4-8851641c6faf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa85533b2-ea') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 22:18:09 compute-0 nova_compute[192716]: 2025-10-07 22:18:09.511 2 DEBUG os_vif [None req-110e4d0c-95b4-4dec-ab56-6faa5e1a333c a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a6:0d:ee,bridge_name='br-int',has_traffic_filtering=True,id=a85533b2-ea61-40a6-a339-d4654846b93d,network=Network(afe7be80-c16b-4cef-89c4-8851641c6faf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa85533b2-ea') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 07 22:18:09 compute-0 nova_compute[192716]: 2025-10-07 22:18:09.514 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:18:09 compute-0 nova_compute[192716]: 2025-10-07 22:18:09.514 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa85533b2-ea, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:18:09 compute-0 nova_compute[192716]: 2025-10-07 22:18:09.516 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:18:09 compute-0 nova_compute[192716]: 2025-10-07 22:18:09.518 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 07 22:18:09 compute-0 nova_compute[192716]: 2025-10-07 22:18:09.520 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:18:09 compute-0 nova_compute[192716]: 2025-10-07 22:18:09.520 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=4655fbed-9aa2-4ae6-bec3-05cfcb44cf83) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:18:09 compute-0 nova_compute[192716]: 2025-10-07 22:18:09.521 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:18:09 compute-0 nova_compute[192716]: 2025-10-07 22:18:09.522 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:18:09 compute-0 nova_compute[192716]: 2025-10-07 22:18:09.526 2 INFO os_vif [None req-110e4d0c-95b4-4dec-ab56-6faa5e1a333c a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a6:0d:ee,bridge_name='br-int',has_traffic_filtering=True,id=a85533b2-ea61-40a6-a339-d4654846b93d,network=Network(afe7be80-c16b-4cef-89c4-8851641c6faf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa85533b2-ea')
Oct 07 22:18:09 compute-0 nova_compute[192716]: 2025-10-07 22:18:09.526 2 INFO nova.virt.libvirt.driver [None req-110e4d0c-95b4-4dec-ab56-6faa5e1a333c a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] [instance: 5b094b64-8b68-4676-b259-85b0d42a9ec1] Deleting instance files /var/lib/nova/instances/5b094b64-8b68-4676-b259-85b0d42a9ec1_del
Oct 07 22:18:09 compute-0 nova_compute[192716]: 2025-10-07 22:18:09.528 2 INFO nova.virt.libvirt.driver [None req-110e4d0c-95b4-4dec-ab56-6faa5e1a333c a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] [instance: 5b094b64-8b68-4676-b259-85b0d42a9ec1] Deletion of /var/lib/nova/instances/5b094b64-8b68-4676-b259-85b0d42a9ec1_del complete
Oct 07 22:18:10 compute-0 nova_compute[192716]: 2025-10-07 22:18:10.080 2 INFO nova.compute.manager [None req-110e4d0c-95b4-4dec-ab56-6faa5e1a333c a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] [instance: 5b094b64-8b68-4676-b259-85b0d42a9ec1] Took 1.34 seconds to destroy the instance on the hypervisor.
Oct 07 22:18:10 compute-0 nova_compute[192716]: 2025-10-07 22:18:10.081 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-110e4d0c-95b4-4dec-ab56-6faa5e1a333c a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 07 22:18:10 compute-0 nova_compute[192716]: 2025-10-07 22:18:10.081 2 DEBUG nova.compute.manager [-] [instance: 5b094b64-8b68-4676-b259-85b0d42a9ec1] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 07 22:18:10 compute-0 nova_compute[192716]: 2025-10-07 22:18:10.081 2 DEBUG nova.network.neutron [-] [instance: 5b094b64-8b68-4676-b259-85b0d42a9ec1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 07 22:18:10 compute-0 nova_compute[192716]: 2025-10-07 22:18:10.082 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:18:10 compute-0 nova_compute[192716]: 2025-10-07 22:18:10.497 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:18:10 compute-0 nova_compute[192716]: 2025-10-07 22:18:10.775 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:18:11 compute-0 nova_compute[192716]: 2025-10-07 22:18:11.046 2 DEBUG nova.compute.manager [req-c43a1532-2bc8-46d8-8928-44c0106ee427 req-c7953368-6209-413c-b969-964ce4a1c53f 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 5b094b64-8b68-4676-b259-85b0d42a9ec1] Received event network-vif-unplugged-a85533b2-ea61-40a6-a339-d4654846b93d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:18:11 compute-0 nova_compute[192716]: 2025-10-07 22:18:11.046 2 DEBUG oslo_concurrency.lockutils [req-c43a1532-2bc8-46d8-8928-44c0106ee427 req-c7953368-6209-413c-b969-964ce4a1c53f 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "5b094b64-8b68-4676-b259-85b0d42a9ec1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:18:11 compute-0 nova_compute[192716]: 2025-10-07 22:18:11.046 2 DEBUG oslo_concurrency.lockutils [req-c43a1532-2bc8-46d8-8928-44c0106ee427 req-c7953368-6209-413c-b969-964ce4a1c53f 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "5b094b64-8b68-4676-b259-85b0d42a9ec1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:18:11 compute-0 nova_compute[192716]: 2025-10-07 22:18:11.047 2 DEBUG oslo_concurrency.lockutils [req-c43a1532-2bc8-46d8-8928-44c0106ee427 req-c7953368-6209-413c-b969-964ce4a1c53f 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "5b094b64-8b68-4676-b259-85b0d42a9ec1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:18:11 compute-0 nova_compute[192716]: 2025-10-07 22:18:11.047 2 DEBUG nova.compute.manager [req-c43a1532-2bc8-46d8-8928-44c0106ee427 req-c7953368-6209-413c-b969-964ce4a1c53f 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 5b094b64-8b68-4676-b259-85b0d42a9ec1] No waiting events found dispatching network-vif-unplugged-a85533b2-ea61-40a6-a339-d4654846b93d pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 22:18:11 compute-0 nova_compute[192716]: 2025-10-07 22:18:11.047 2 DEBUG nova.compute.manager [req-c43a1532-2bc8-46d8-8928-44c0106ee427 req-c7953368-6209-413c-b969-964ce4a1c53f 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 5b094b64-8b68-4676-b259-85b0d42a9ec1] Received event network-vif-unplugged-a85533b2-ea61-40a6-a339-d4654846b93d for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 07 22:18:11 compute-0 nova_compute[192716]: 2025-10-07 22:18:11.047 2 DEBUG nova.compute.manager [req-c43a1532-2bc8-46d8-8928-44c0106ee427 req-c7953368-6209-413c-b969-964ce4a1c53f 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 5b094b64-8b68-4676-b259-85b0d42a9ec1] Received event network-vif-deleted-a85533b2-ea61-40a6-a339-d4654846b93d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:18:11 compute-0 nova_compute[192716]: 2025-10-07 22:18:11.047 2 INFO nova.compute.manager [req-c43a1532-2bc8-46d8-8928-44c0106ee427 req-c7953368-6209-413c-b969-964ce4a1c53f 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 5b094b64-8b68-4676-b259-85b0d42a9ec1] Neutron deleted interface a85533b2-ea61-40a6-a339-d4654846b93d; detaching it from the instance and deleting it from the info cache
Oct 07 22:18:11 compute-0 nova_compute[192716]: 2025-10-07 22:18:11.048 2 DEBUG nova.network.neutron [req-c43a1532-2bc8-46d8-8928-44c0106ee427 req-c7953368-6209-413c-b969-964ce4a1c53f 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 5b094b64-8b68-4676-b259-85b0d42a9ec1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:18:11 compute-0 sshd-session[226520]: Failed password for root from 193.46.255.244 port 63238 ssh2
Oct 07 22:18:11 compute-0 nova_compute[192716]: 2025-10-07 22:18:11.320 2 DEBUG nova.network.neutron [-] [instance: 5b094b64-8b68-4676-b259-85b0d42a9ec1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:18:11 compute-0 nova_compute[192716]: 2025-10-07 22:18:11.557 2 DEBUG nova.compute.manager [req-c43a1532-2bc8-46d8-8928-44c0106ee427 req-c7953368-6209-413c-b969-964ce4a1c53f 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: 5b094b64-8b68-4676-b259-85b0d42a9ec1] Detach interface failed, port_id=a85533b2-ea61-40a6-a339-d4654846b93d, reason: Instance 5b094b64-8b68-4676-b259-85b0d42a9ec1 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 07 22:18:11 compute-0 nova_compute[192716]: 2025-10-07 22:18:11.836 2 INFO nova.compute.manager [-] [instance: 5b094b64-8b68-4676-b259-85b0d42a9ec1] Took 1.75 seconds to deallocate network for instance.
Oct 07 22:18:11 compute-0 podman[226598]: 2025-10-07 22:18:11.845388671 +0000 UTC m=+0.080903468 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.component=ubi9-minimal-container, vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.buildah.version=1.33.7, managed_by=edpm_ansible, name=ubi9-minimal, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41)
Oct 07 22:18:12 compute-0 nova_compute[192716]: 2025-10-07 22:18:12.360 2 DEBUG oslo_concurrency.lockutils [None req-110e4d0c-95b4-4dec-ab56-6faa5e1a333c a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:18:12 compute-0 nova_compute[192716]: 2025-10-07 22:18:12.363 2 DEBUG oslo_concurrency.lockutils [None req-110e4d0c-95b4-4dec-ab56-6faa5e1a333c a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.003s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:18:12 compute-0 nova_compute[192716]: 2025-10-07 22:18:12.370 2 DEBUG oslo_concurrency.lockutils [None req-110e4d0c-95b4-4dec-ab56-6faa5e1a333c a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.007s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:18:12 compute-0 nova_compute[192716]: 2025-10-07 22:18:12.412 2 INFO nova.scheduler.client.report [None req-110e4d0c-95b4-4dec-ab56-6faa5e1a333c a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Deleted allocations for instance 5b094b64-8b68-4676-b259-85b0d42a9ec1
Oct 07 22:18:13 compute-0 unix_chkpwd[226619]: password check failed for user (root)
Oct 07 22:18:13 compute-0 nova_compute[192716]: 2025-10-07 22:18:13.447 2 DEBUG oslo_concurrency.lockutils [None req-110e4d0c-95b4-4dec-ab56-6faa5e1a333c a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Lock "5b094b64-8b68-4676-b259-85b0d42a9ec1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.245s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:18:14 compute-0 nova_compute[192716]: 2025-10-07 22:18:14.214 2 DEBUG oslo_concurrency.lockutils [None req-3a5a33f1-0395-4f90-8b50-b8e80ee4ee3c a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Acquiring lock "e7c02487-5b40-4170-ba2a-9c025aae40b9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:18:14 compute-0 nova_compute[192716]: 2025-10-07 22:18:14.215 2 DEBUG oslo_concurrency.lockutils [None req-3a5a33f1-0395-4f90-8b50-b8e80ee4ee3c a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Lock "e7c02487-5b40-4170-ba2a-9c025aae40b9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:18:14 compute-0 nova_compute[192716]: 2025-10-07 22:18:14.216 2 DEBUG oslo_concurrency.lockutils [None req-3a5a33f1-0395-4f90-8b50-b8e80ee4ee3c a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Acquiring lock "e7c02487-5b40-4170-ba2a-9c025aae40b9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:18:14 compute-0 nova_compute[192716]: 2025-10-07 22:18:14.216 2 DEBUG oslo_concurrency.lockutils [None req-3a5a33f1-0395-4f90-8b50-b8e80ee4ee3c a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Lock "e7c02487-5b40-4170-ba2a-9c025aae40b9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:18:14 compute-0 nova_compute[192716]: 2025-10-07 22:18:14.217 2 DEBUG oslo_concurrency.lockutils [None req-3a5a33f1-0395-4f90-8b50-b8e80ee4ee3c a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Lock "e7c02487-5b40-4170-ba2a-9c025aae40b9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:18:14 compute-0 nova_compute[192716]: 2025-10-07 22:18:14.233 2 INFO nova.compute.manager [None req-3a5a33f1-0395-4f90-8b50-b8e80ee4ee3c a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] [instance: e7c02487-5b40-4170-ba2a-9c025aae40b9] Terminating instance
Oct 07 22:18:14 compute-0 nova_compute[192716]: 2025-10-07 22:18:14.522 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:18:14 compute-0 nova_compute[192716]: 2025-10-07 22:18:14.748 2 DEBUG nova.compute.manager [None req-3a5a33f1-0395-4f90-8b50-b8e80ee4ee3c a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] [instance: e7c02487-5b40-4170-ba2a-9c025aae40b9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 07 22:18:14 compute-0 kernel: tapcf1b9092-14 (unregistering): left promiscuous mode
Oct 07 22:18:14 compute-0 NetworkManager[51722]: <info>  [1759875494.7760] device (tapcf1b9092-14): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 07 22:18:14 compute-0 nova_compute[192716]: 2025-10-07 22:18:14.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:18:14 compute-0 ovn_controller[94904]: 2025-10-07T22:18:14Z|00239|binding|INFO|Releasing lport cf1b9092-14e3-4326-99c8-2801b53abb26 from this chassis (sb_readonly=0)
Oct 07 22:18:14 compute-0 ovn_controller[94904]: 2025-10-07T22:18:14Z|00240|binding|INFO|Setting lport cf1b9092-14e3-4326-99c8-2801b53abb26 down in Southbound
Oct 07 22:18:14 compute-0 ovn_controller[94904]: 2025-10-07T22:18:14Z|00241|binding|INFO|Removing iface tapcf1b9092-14 ovn-installed in OVS
Oct 07 22:18:14 compute-0 nova_compute[192716]: 2025-10-07 22:18:14.786 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:18:14 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:18:14.791 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6c:2c:cf 10.100.0.14'], port_security=['fa:16:3e:6c:2c:cf 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'e7c02487-5b40-4170-ba2a-9c025aae40b9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-afe7be80-c16b-4cef-89c4-8851641c6faf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4cb01004a26f472187e01e5d3a57f84a', 'neutron:revision_number': '16', 'neutron:security_group_ids': '93dab7df-ccdf-44ad-a320-72fe683eb516', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=79a90f3c-820c-43b7-a388-8b7a51286af4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], logical_port=cf1b9092-14e3-4326-99c8-2801b53abb26) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f071072e510>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:18:14 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:18:14.792 103791 INFO neutron.agent.ovn.metadata.agent [-] Port cf1b9092-14e3-4326-99c8-2801b53abb26 in datapath afe7be80-c16b-4cef-89c4-8851641c6faf unbound from our chassis
Oct 07 22:18:14 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:18:14.793 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network afe7be80-c16b-4cef-89c4-8851641c6faf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 07 22:18:14 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:18:14.795 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[86bbec36-95b0-4822-91c8-ccedd2d30377]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:18:14 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:18:14.796 103791 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf namespace which is not needed anymore
Oct 07 22:18:14 compute-0 nova_compute[192716]: 2025-10-07 22:18:14.809 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:18:14 compute-0 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d0000001a.scope: Deactivated successfully.
Oct 07 22:18:14 compute-0 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d0000001a.scope: Consumed 3.892s CPU time.
Oct 07 22:18:14 compute-0 systemd-machined[152719]: Machine qemu-19-instance-0000001a terminated.
Oct 07 22:18:14 compute-0 neutron-haproxy-ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf[226218]: [NOTICE]   (226253) : haproxy version is 3.0.5-8e879a5
Oct 07 22:18:14 compute-0 podman[226643]: 2025-10-07 22:18:14.903605871 +0000 UTC m=+0.029171004 container kill 734fbdf485fe7e3165df00b025c3347df77c1c9e515fbc0332456995f3e0f0d7 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct 07 22:18:14 compute-0 neutron-haproxy-ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf[226218]: [NOTICE]   (226253) : path to executable is /usr/sbin/haproxy
Oct 07 22:18:14 compute-0 neutron-haproxy-ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf[226218]: [WARNING]  (226253) : Exiting Master process...
Oct 07 22:18:14 compute-0 neutron-haproxy-ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf[226218]: [ALERT]    (226253) : Current worker (226260) exited with code 143 (Terminated)
Oct 07 22:18:14 compute-0 neutron-haproxy-ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf[226218]: [WARNING]  (226253) : All workers exited. Exiting... (0)
Oct 07 22:18:14 compute-0 systemd[1]: libpod-734fbdf485fe7e3165df00b025c3347df77c1c9e515fbc0332456995f3e0f0d7.scope: Deactivated successfully.
Oct 07 22:18:14 compute-0 nova_compute[192716]: 2025-10-07 22:18:14.934 2 DEBUG nova.compute.manager [req-0b37c416-5931-47ed-83ad-a367af51a5ba req-c35e7148-a236-4bc1-8523-070923ec6098 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: e7c02487-5b40-4170-ba2a-9c025aae40b9] Received event network-vif-unplugged-cf1b9092-14e3-4326-99c8-2801b53abb26 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:18:14 compute-0 nova_compute[192716]: 2025-10-07 22:18:14.935 2 DEBUG oslo_concurrency.lockutils [req-0b37c416-5931-47ed-83ad-a367af51a5ba req-c35e7148-a236-4bc1-8523-070923ec6098 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "e7c02487-5b40-4170-ba2a-9c025aae40b9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:18:14 compute-0 nova_compute[192716]: 2025-10-07 22:18:14.935 2 DEBUG oslo_concurrency.lockutils [req-0b37c416-5931-47ed-83ad-a367af51a5ba req-c35e7148-a236-4bc1-8523-070923ec6098 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "e7c02487-5b40-4170-ba2a-9c025aae40b9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:18:14 compute-0 nova_compute[192716]: 2025-10-07 22:18:14.936 2 DEBUG oslo_concurrency.lockutils [req-0b37c416-5931-47ed-83ad-a367af51a5ba req-c35e7148-a236-4bc1-8523-070923ec6098 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "e7c02487-5b40-4170-ba2a-9c025aae40b9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:18:14 compute-0 nova_compute[192716]: 2025-10-07 22:18:14.936 2 DEBUG nova.compute.manager [req-0b37c416-5931-47ed-83ad-a367af51a5ba req-c35e7148-a236-4bc1-8523-070923ec6098 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: e7c02487-5b40-4170-ba2a-9c025aae40b9] No waiting events found dispatching network-vif-unplugged-cf1b9092-14e3-4326-99c8-2801b53abb26 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 22:18:14 compute-0 nova_compute[192716]: 2025-10-07 22:18:14.936 2 DEBUG nova.compute.manager [req-0b37c416-5931-47ed-83ad-a367af51a5ba req-c35e7148-a236-4bc1-8523-070923ec6098 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: e7c02487-5b40-4170-ba2a-9c025aae40b9] Received event network-vif-unplugged-cf1b9092-14e3-4326-99c8-2801b53abb26 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 07 22:18:14 compute-0 podman[226658]: 2025-10-07 22:18:14.948179295 +0000 UTC m=+0.023444097 container died 734fbdf485fe7e3165df00b025c3347df77c1c9e515fbc0332456995f3e0f0d7 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team)
Oct 07 22:18:14 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-734fbdf485fe7e3165df00b025c3347df77c1c9e515fbc0332456995f3e0f0d7-userdata-shm.mount: Deactivated successfully.
Oct 07 22:18:14 compute-0 systemd[1]: var-lib-containers-storage-overlay-311fc502461087d7dd92bf90ab2f746dfa2e44dacded5f03a3b714f6fa7769ab-merged.mount: Deactivated successfully.
Oct 07 22:18:14 compute-0 podman[226658]: 2025-10-07 22:18:14.984154478 +0000 UTC m=+0.059419260 container cleanup 734fbdf485fe7e3165df00b025c3347df77c1c9e515fbc0332456995f3e0f0d7 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251007)
Oct 07 22:18:14 compute-0 systemd[1]: libpod-conmon-734fbdf485fe7e3165df00b025c3347df77c1c9e515fbc0332456995f3e0f0d7.scope: Deactivated successfully.
Oct 07 22:18:15 compute-0 podman[226660]: 2025-10-07 22:18:15.001714851 +0000 UTC m=+0.074211432 container remove 734fbdf485fe7e3165df00b025c3347df77c1c9e515fbc0332456995f3e0f0d7 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct 07 22:18:15 compute-0 nova_compute[192716]: 2025-10-07 22:18:15.005 2 INFO nova.virt.libvirt.driver [-] [instance: e7c02487-5b40-4170-ba2a-9c025aae40b9] Instance destroyed successfully.
Oct 07 22:18:15 compute-0 nova_compute[192716]: 2025-10-07 22:18:15.005 2 DEBUG nova.objects.instance [None req-3a5a33f1-0395-4f90-8b50-b8e80ee4ee3c a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Lazy-loading 'resources' on Instance uuid e7c02487-5b40-4170-ba2a-9c025aae40b9 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 22:18:15 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:18:15.008 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[1be2e679-ddc1-4662-8010-acdebbb7e96f]: (4, ("Tue Oct  7 10:18:14 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf (734fbdf485fe7e3165df00b025c3347df77c1c9e515fbc0332456995f3e0f0d7)\n734fbdf485fe7e3165df00b025c3347df77c1c9e515fbc0332456995f3e0f0d7\nTue Oct  7 10:18:14 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf (734fbdf485fe7e3165df00b025c3347df77c1c9e515fbc0332456995f3e0f0d7)\n734fbdf485fe7e3165df00b025c3347df77c1c9e515fbc0332456995f3e0f0d7\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:18:15 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:18:15.009 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[75f2b928-935e-4d41-8feb-80f541bac467]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:18:15 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:18:15.009 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/afe7be80-c16b-4cef-89c4-8851641c6faf.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/afe7be80-c16b-4cef-89c4-8851641c6faf.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 22:18:15 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:18:15.010 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[b480cde8-d6d3-43af-80e6-438a50620a24]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:18:15 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:18:15.010 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapafe7be80-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:18:15 compute-0 nova_compute[192716]: 2025-10-07 22:18:15.011 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:18:15 compute-0 kernel: tapafe7be80-c0: left promiscuous mode
Oct 07 22:18:15 compute-0 nova_compute[192716]: 2025-10-07 22:18:15.025 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:18:15 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:18:15.028 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[84ccbcd1-61c5-4f85-9c22-582365372a93]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:18:15 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:18:15.060 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[eddcfea4-d144-46f0-a8d7-21a69adbe5cb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:18:15 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:18:15.061 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[c51ac7b3-3321-4e46-aaa1-7677d45c1bca]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:18:15 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:18:15.079 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[60aabf90-e1c6-497b-8c45-1754a9913b2c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 517624, 'reachable_time': 15083, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226709, 'error': None, 'target': 'ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:18:15 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:18:15.081 103905 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Oct 07 22:18:15 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:18:15.081 103905 DEBUG oslo.privsep.daemon [-] privsep: reply[2c44dff1-58d8-4de0-850d-a50d78261d68]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:18:15 compute-0 systemd[1]: run-netns-ovnmeta\x2dafe7be80\x2dc16b\x2d4cef\x2d89c4\x2d8851641c6faf.mount: Deactivated successfully.
Oct 07 22:18:15 compute-0 nova_compute[192716]: 2025-10-07 22:18:15.511 2 DEBUG nova.virt.libvirt.vif [None req-3a5a33f1-0395-4f90-8b50-b8e80ee4ee3c a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-10-07T22:15:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-1445845716',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-1445845',id=26,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-07T22:16:11Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4cb01004a26f472187e01e5d3a57f84a',ramdisk_id='',reservation_id='r-u6ansgmm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',clean_attempts='1',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_h
w_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-866189760',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-866189760-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T22:17:32Z,user_data=None,user_id='a0c373c3cf7242d4af22e259b5a27a6b',uuid=e7c02487-5b40-4170-ba2a-9c025aae40b9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cf1b9092-14e3-4326-99c8-2801b53abb26", "address": "fa:16:3e:6c:2c:cf", "network": {"id": "afe7be80-c16b-4cef-89c4-8851641c6faf", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-895931661-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f18b68e1837842c293e8b6a621641354", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcf1b9092-14", "ovs_interfaceid": "cf1b9092-14e3-4326-99c8-2801b53abb26", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 07 22:18:15 compute-0 nova_compute[192716]: 2025-10-07 22:18:15.511 2 DEBUG nova.network.os_vif_util [None req-3a5a33f1-0395-4f90-8b50-b8e80ee4ee3c a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Converting VIF {"id": "cf1b9092-14e3-4326-99c8-2801b53abb26", "address": "fa:16:3e:6c:2c:cf", "network": {"id": "afe7be80-c16b-4cef-89c4-8851641c6faf", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-895931661-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f18b68e1837842c293e8b6a621641354", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcf1b9092-14", "ovs_interfaceid": "cf1b9092-14e3-4326-99c8-2801b53abb26", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 22:18:15 compute-0 nova_compute[192716]: 2025-10-07 22:18:15.512 2 DEBUG nova.network.os_vif_util [None req-3a5a33f1-0395-4f90-8b50-b8e80ee4ee3c a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:6c:2c:cf,bridge_name='br-int',has_traffic_filtering=True,id=cf1b9092-14e3-4326-99c8-2801b53abb26,network=Network(afe7be80-c16b-4cef-89c4-8851641c6faf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcf1b9092-14') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 22:18:15 compute-0 nova_compute[192716]: 2025-10-07 22:18:15.513 2 DEBUG os_vif [None req-3a5a33f1-0395-4f90-8b50-b8e80ee4ee3c a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:6c:2c:cf,bridge_name='br-int',has_traffic_filtering=True,id=cf1b9092-14e3-4326-99c8-2801b53abb26,network=Network(afe7be80-c16b-4cef-89c4-8851641c6faf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcf1b9092-14') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 07 22:18:15 compute-0 nova_compute[192716]: 2025-10-07 22:18:15.514 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:18:15 compute-0 nova_compute[192716]: 2025-10-07 22:18:15.515 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcf1b9092-14, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:18:15 compute-0 nova_compute[192716]: 2025-10-07 22:18:15.574 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:18:15 compute-0 nova_compute[192716]: 2025-10-07 22:18:15.577 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 07 22:18:15 compute-0 nova_compute[192716]: 2025-10-07 22:18:15.578 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:18:15 compute-0 nova_compute[192716]: 2025-10-07 22:18:15.578 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=51fa3eb2-22be-46c0-b3da-fe2523e59f24) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:18:15 compute-0 nova_compute[192716]: 2025-10-07 22:18:15.579 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:18:15 compute-0 nova_compute[192716]: 2025-10-07 22:18:15.581 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 07 22:18:15 compute-0 nova_compute[192716]: 2025-10-07 22:18:15.583 2 INFO os_vif [None req-3a5a33f1-0395-4f90-8b50-b8e80ee4ee3c a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:6c:2c:cf,bridge_name='br-int',has_traffic_filtering=True,id=cf1b9092-14e3-4326-99c8-2801b53abb26,network=Network(afe7be80-c16b-4cef-89c4-8851641c6faf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcf1b9092-14')
Oct 07 22:18:15 compute-0 nova_compute[192716]: 2025-10-07 22:18:15.583 2 INFO nova.virt.libvirt.driver [None req-3a5a33f1-0395-4f90-8b50-b8e80ee4ee3c a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] [instance: e7c02487-5b40-4170-ba2a-9c025aae40b9] Deleting instance files /var/lib/nova/instances/e7c02487-5b40-4170-ba2a-9c025aae40b9_del
Oct 07 22:18:15 compute-0 nova_compute[192716]: 2025-10-07 22:18:15.584 2 INFO nova.virt.libvirt.driver [None req-3a5a33f1-0395-4f90-8b50-b8e80ee4ee3c a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] [instance: e7c02487-5b40-4170-ba2a-9c025aae40b9] Deletion of /var/lib/nova/instances/e7c02487-5b40-4170-ba2a-9c025aae40b9_del complete
Oct 07 22:18:15 compute-0 sshd-session[226520]: Failed password for root from 193.46.255.244 port 63238 ssh2
Oct 07 22:18:15 compute-0 nova_compute[192716]: 2025-10-07 22:18:15.777 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:18:16 compute-0 nova_compute[192716]: 2025-10-07 22:18:16.094 2 INFO nova.compute.manager [None req-3a5a33f1-0395-4f90-8b50-b8e80ee4ee3c a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] [instance: e7c02487-5b40-4170-ba2a-9c025aae40b9] Took 1.35 seconds to destroy the instance on the hypervisor.
Oct 07 22:18:16 compute-0 nova_compute[192716]: 2025-10-07 22:18:16.095 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-3a5a33f1-0395-4f90-8b50-b8e80ee4ee3c a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 07 22:18:16 compute-0 nova_compute[192716]: 2025-10-07 22:18:16.095 2 DEBUG nova.compute.manager [-] [instance: e7c02487-5b40-4170-ba2a-9c025aae40b9] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 07 22:18:16 compute-0 nova_compute[192716]: 2025-10-07 22:18:16.095 2 DEBUG nova.network.neutron [-] [instance: e7c02487-5b40-4170-ba2a-9c025aae40b9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 07 22:18:16 compute-0 nova_compute[192716]: 2025-10-07 22:18:16.095 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:18:16 compute-0 nova_compute[192716]: 2025-10-07 22:18:16.526 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:18:16 compute-0 nova_compute[192716]: 2025-10-07 22:18:16.918 2 DEBUG nova.compute.manager [req-41bd813f-bfb7-42a5-852d-e7c9e06b9f50 req-ee561cb7-d537-4e69-8141-7afdabc58ee4 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: e7c02487-5b40-4170-ba2a-9c025aae40b9] Received event network-vif-deleted-cf1b9092-14e3-4326-99c8-2801b53abb26 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:18:16 compute-0 nova_compute[192716]: 2025-10-07 22:18:16.919 2 INFO nova.compute.manager [req-41bd813f-bfb7-42a5-852d-e7c9e06b9f50 req-ee561cb7-d537-4e69-8141-7afdabc58ee4 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: e7c02487-5b40-4170-ba2a-9c025aae40b9] Neutron deleted interface cf1b9092-14e3-4326-99c8-2801b53abb26; detaching it from the instance and deleting it from the info cache
Oct 07 22:18:16 compute-0 nova_compute[192716]: 2025-10-07 22:18:16.920 2 DEBUG nova.network.neutron [req-41bd813f-bfb7-42a5-852d-e7c9e06b9f50 req-ee561cb7-d537-4e69-8141-7afdabc58ee4 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: e7c02487-5b40-4170-ba2a-9c025aae40b9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:18:16 compute-0 nova_compute[192716]: 2025-10-07 22:18:16.986 2 DEBUG nova.compute.manager [req-b18856cf-4ad6-409f-ae58-2649b6beba84 req-478ad8ac-1a99-4b4d-9db6-709bcccfe106 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: e7c02487-5b40-4170-ba2a-9c025aae40b9] Received event network-vif-unplugged-cf1b9092-14e3-4326-99c8-2801b53abb26 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:18:16 compute-0 nova_compute[192716]: 2025-10-07 22:18:16.987 2 DEBUG oslo_concurrency.lockutils [req-b18856cf-4ad6-409f-ae58-2649b6beba84 req-478ad8ac-1a99-4b4d-9db6-709bcccfe106 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "e7c02487-5b40-4170-ba2a-9c025aae40b9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:18:16 compute-0 nova_compute[192716]: 2025-10-07 22:18:16.987 2 DEBUG oslo_concurrency.lockutils [req-b18856cf-4ad6-409f-ae58-2649b6beba84 req-478ad8ac-1a99-4b4d-9db6-709bcccfe106 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "e7c02487-5b40-4170-ba2a-9c025aae40b9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:18:16 compute-0 nova_compute[192716]: 2025-10-07 22:18:16.987 2 DEBUG oslo_concurrency.lockutils [req-b18856cf-4ad6-409f-ae58-2649b6beba84 req-478ad8ac-1a99-4b4d-9db6-709bcccfe106 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "e7c02487-5b40-4170-ba2a-9c025aae40b9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:18:16 compute-0 nova_compute[192716]: 2025-10-07 22:18:16.987 2 DEBUG nova.compute.manager [req-b18856cf-4ad6-409f-ae58-2649b6beba84 req-478ad8ac-1a99-4b4d-9db6-709bcccfe106 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: e7c02487-5b40-4170-ba2a-9c025aae40b9] No waiting events found dispatching network-vif-unplugged-cf1b9092-14e3-4326-99c8-2801b53abb26 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 22:18:16 compute-0 nova_compute[192716]: 2025-10-07 22:18:16.988 2 DEBUG nova.compute.manager [req-b18856cf-4ad6-409f-ae58-2649b6beba84 req-478ad8ac-1a99-4b4d-9db6-709bcccfe106 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: e7c02487-5b40-4170-ba2a-9c025aae40b9] Received event network-vif-unplugged-cf1b9092-14e3-4326-99c8-2801b53abb26 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 07 22:18:17 compute-0 nova_compute[192716]: 2025-10-07 22:18:17.349 2 DEBUG nova.network.neutron [-] [instance: e7c02487-5b40-4170-ba2a-9c025aae40b9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:18:17 compute-0 nova_compute[192716]: 2025-10-07 22:18:17.430 2 DEBUG nova.compute.manager [req-41bd813f-bfb7-42a5-852d-e7c9e06b9f50 req-ee561cb7-d537-4e69-8141-7afdabc58ee4 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: e7c02487-5b40-4170-ba2a-9c025aae40b9] Detach interface failed, port_id=cf1b9092-14e3-4326-99c8-2801b53abb26, reason: Instance e7c02487-5b40-4170-ba2a-9c025aae40b9 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 07 22:18:17 compute-0 sshd-session[226520]: Received disconnect from 193.46.255.244 port 63238:11:  [preauth]
Oct 07 22:18:17 compute-0 sshd-session[226520]: Disconnected from authenticating user root 193.46.255.244 port 63238 [preauth]
Oct 07 22:18:17 compute-0 sshd-session[226520]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.244  user=root
Oct 07 22:18:17 compute-0 nova_compute[192716]: 2025-10-07 22:18:17.860 2 INFO nova.compute.manager [-] [instance: e7c02487-5b40-4170-ba2a-9c025aae40b9] Took 1.76 seconds to deallocate network for instance.
Oct 07 22:18:18 compute-0 nova_compute[192716]: 2025-10-07 22:18:18.383 2 DEBUG oslo_concurrency.lockutils [None req-3a5a33f1-0395-4f90-8b50-b8e80ee4ee3c a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:18:18 compute-0 nova_compute[192716]: 2025-10-07 22:18:18.384 2 DEBUG oslo_concurrency.lockutils [None req-3a5a33f1-0395-4f90-8b50-b8e80ee4ee3c a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:18:18 compute-0 unix_chkpwd[226712]: password check failed for user (root)
Oct 07 22:18:18 compute-0 sshd-session[226710]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.244  user=root
Oct 07 22:18:18 compute-0 nova_compute[192716]: 2025-10-07 22:18:18.450 2 DEBUG nova.compute.provider_tree [None req-3a5a33f1-0395-4f90-8b50-b8e80ee4ee3c a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 22:18:18 compute-0 nova_compute[192716]: 2025-10-07 22:18:18.959 2 DEBUG nova.scheduler.client.report [None req-3a5a33f1-0395-4f90-8b50-b8e80ee4ee3c a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 22:18:19 compute-0 nova_compute[192716]: 2025-10-07 22:18:19.470 2 DEBUG oslo_concurrency.lockutils [None req-3a5a33f1-0395-4f90-8b50-b8e80ee4ee3c a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.086s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:18:19 compute-0 nova_compute[192716]: 2025-10-07 22:18:19.542 2 INFO nova.scheduler.client.report [None req-3a5a33f1-0395-4f90-8b50-b8e80ee4ee3c a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Deleted allocations for instance e7c02487-5b40-4170-ba2a-9c025aae40b9
Oct 07 22:18:20 compute-0 sshd-session[226710]: Failed password for root from 193.46.255.244 port 14482 ssh2
Oct 07 22:18:20 compute-0 unix_chkpwd[226714]: password check failed for user (root)
Oct 07 22:18:20 compute-0 nova_compute[192716]: 2025-10-07 22:18:20.581 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:18:20 compute-0 nova_compute[192716]: 2025-10-07 22:18:20.584 2 DEBUG oslo_concurrency.lockutils [None req-3a5a33f1-0395-4f90-8b50-b8e80ee4ee3c a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Lock "e7c02487-5b40-4170-ba2a-9c025aae40b9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.368s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:18:20 compute-0 nova_compute[192716]: 2025-10-07 22:18:20.780 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:18:22 compute-0 sshd-session[226710]: Failed password for root from 193.46.255.244 port 14482 ssh2
Oct 07 22:18:22 compute-0 unix_chkpwd[226715]: password check failed for user (root)
Oct 07 22:18:24 compute-0 podman[226716]: 2025-10-07 22:18:24.83699495 +0000 UTC m=+0.075085438 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Oct 07 22:18:24 compute-0 podman[226717]: 2025-10-07 22:18:24.837491664 +0000 UTC m=+0.074210302 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251007)
Oct 07 22:18:25 compute-0 sshd-session[226710]: Failed password for root from 193.46.255.244 port 14482 ssh2
Oct 07 22:18:25 compute-0 nova_compute[192716]: 2025-10-07 22:18:25.582 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:18:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:18:25.658 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:18:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:18:25.658 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:18:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:18:25.658 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:18:25 compute-0 nova_compute[192716]: 2025-10-07 22:18:25.782 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:18:26 compute-0 sshd-session[226710]: Received disconnect from 193.46.255.244 port 14482:11:  [preauth]
Oct 07 22:18:26 compute-0 sshd-session[226710]: Disconnected from authenticating user root 193.46.255.244 port 14482 [preauth]
Oct 07 22:18:26 compute-0 sshd-session[226710]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.244  user=root
Oct 07 22:18:28 compute-0 podman[226756]: 2025-10-07 22:18:28.832624373 +0000 UTC m=+0.071314736 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 07 22:18:29 compute-0 podman[203153]: time="2025-10-07T22:18:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:18:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:18:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 22:18:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:18:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3027 "" "Go-http-client/1.1"
Oct 07 22:18:30 compute-0 nova_compute[192716]: 2025-10-07 22:18:30.585 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:18:30 compute-0 nova_compute[192716]: 2025-10-07 22:18:30.784 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:18:31 compute-0 openstack_network_exporter[205305]: ERROR   22:18:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:18:31 compute-0 openstack_network_exporter[205305]: ERROR   22:18:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:18:31 compute-0 openstack_network_exporter[205305]: ERROR   22:18:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:18:31 compute-0 openstack_network_exporter[205305]: ERROR   22:18:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:18:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:18:31 compute-0 openstack_network_exporter[205305]: ERROR   22:18:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:18:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:18:35 compute-0 nova_compute[192716]: 2025-10-07 22:18:35.587 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:18:35 compute-0 nova_compute[192716]: 2025-10-07 22:18:35.786 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:18:36 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:18:36.690 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=29, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:e1:cc', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '62:84:ba:a4:1b:31'}, ipsec=False) old=SB_Global(nb_cfg=28) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:18:36 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:18:36.691 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 07 22:18:36 compute-0 nova_compute[192716]: 2025-10-07 22:18:36.691 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:18:36 compute-0 podman[226784]: 2025-10-07 22:18:36.82450479 +0000 UTC m=+0.065190968 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251007, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 07 22:18:36 compute-0 podman[226783]: 2025-10-07 22:18:36.880245661 +0000 UTC m=+0.119091735 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2)
Oct 07 22:18:39 compute-0 nova_compute[192716]: 2025-10-07 22:18:39.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:18:39 compute-0 nova_compute[192716]: 2025-10-07 22:18:39.991 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 07 22:18:40 compute-0 nova_compute[192716]: 2025-10-07 22:18:40.589 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:18:40 compute-0 nova_compute[192716]: 2025-10-07 22:18:40.789 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:18:42 compute-0 podman[226830]: 2025-10-07 22:18:42.841066471 +0000 UTC m=+0.082464604 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-type=git, container_name=openstack_network_exporter, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, distribution-scope=public, name=ubi9-minimal, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct 07 22:18:42 compute-0 nova_compute[192716]: 2025-10-07 22:18:42.991 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:18:45 compute-0 nova_compute[192716]: 2025-10-07 22:18:45.591 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:18:45 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:18:45.693 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=dca786dc-b408-4181-8e47-0e14c60f13da, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '29'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:18:45 compute-0 nova_compute[192716]: 2025-10-07 22:18:45.792 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:18:45 compute-0 nova_compute[192716]: 2025-10-07 22:18:45.985 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:18:47 compute-0 nova_compute[192716]: 2025-10-07 22:18:47.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:18:48 compute-0 nova_compute[192716]: 2025-10-07 22:18:48.506 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:18:48 compute-0 nova_compute[192716]: 2025-10-07 22:18:48.507 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:18:48 compute-0 nova_compute[192716]: 2025-10-07 22:18:48.507 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:18:48 compute-0 nova_compute[192716]: 2025-10-07 22:18:48.507 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 07 22:18:48 compute-0 nova_compute[192716]: 2025-10-07 22:18:48.693 2 WARNING nova.virt.libvirt.driver [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 22:18:48 compute-0 nova_compute[192716]: 2025-10-07 22:18:48.695 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:18:48 compute-0 nova_compute[192716]: 2025-10-07 22:18:48.738 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.043s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:18:48 compute-0 nova_compute[192716]: 2025-10-07 22:18:48.738 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5840MB free_disk=73.29896545410156GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 07 22:18:48 compute-0 nova_compute[192716]: 2025-10-07 22:18:48.739 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:18:48 compute-0 nova_compute[192716]: 2025-10-07 22:18:48.739 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:18:49 compute-0 nova_compute[192716]: 2025-10-07 22:18:49.781 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 07 22:18:49 compute-0 nova_compute[192716]: 2025-10-07 22:18:49.782 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:18:48 up  1:27,  0 user,  load average: 0.08, 0.11, 0.18\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 07 22:18:49 compute-0 nova_compute[192716]: 2025-10-07 22:18:49.803 2 DEBUG nova.compute.provider_tree [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 22:18:50 compute-0 nova_compute[192716]: 2025-10-07 22:18:50.312 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 22:18:50 compute-0 nova_compute[192716]: 2025-10-07 22:18:50.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:18:50 compute-0 nova_compute[192716]: 2025-10-07 22:18:50.793 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:18:50 compute-0 nova_compute[192716]: 2025-10-07 22:18:50.828 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 07 22:18:50 compute-0 nova_compute[192716]: 2025-10-07 22:18:50.828 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.089s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:18:51 compute-0 nova_compute[192716]: 2025-10-07 22:18:51.829 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:18:54 compute-0 nova_compute[192716]: 2025-10-07 22:18:54.991 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:18:54 compute-0 nova_compute[192716]: 2025-10-07 22:18:54.991 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:18:55 compute-0 nova_compute[192716]: 2025-10-07 22:18:55.596 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:18:55 compute-0 nova_compute[192716]: 2025-10-07 22:18:55.796 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:18:55 compute-0 podman[226853]: 2025-10-07 22:18:55.847256735 +0000 UTC m=+0.081151675 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct 07 22:18:55 compute-0 podman[226854]: 2025-10-07 22:18:55.851225421 +0000 UTC m=+0.080566158 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251007)
Oct 07 22:18:56 compute-0 nova_compute[192716]: 2025-10-07 22:18:56.991 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:18:59 compute-0 podman[203153]: time="2025-10-07T22:18:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:18:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:18:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 22:18:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:18:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3027 "" "Go-http-client/1.1"
Oct 07 22:18:59 compute-0 podman[226891]: 2025-10-07 22:18:59.845553747 +0000 UTC m=+0.080500426 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 07 22:19:00 compute-0 nova_compute[192716]: 2025-10-07 22:19:00.598 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:19:00 compute-0 nova_compute[192716]: 2025-10-07 22:19:00.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:19:01 compute-0 openstack_network_exporter[205305]: ERROR   22:19:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:19:01 compute-0 openstack_network_exporter[205305]: ERROR   22:19:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:19:01 compute-0 openstack_network_exporter[205305]: ERROR   22:19:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:19:01 compute-0 openstack_network_exporter[205305]: ERROR   22:19:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:19:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:19:01 compute-0 openstack_network_exporter[205305]: ERROR   22:19:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:19:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:19:05 compute-0 nova_compute[192716]: 2025-10-07 22:19:05.600 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:19:05 compute-0 nova_compute[192716]: 2025-10-07 22:19:05.801 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:19:07 compute-0 podman[226917]: 2025-10-07 22:19:07.83112698 +0000 UTC m=+0.059915844 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Oct 07 22:19:07 compute-0 podman[226916]: 2025-10-07 22:19:07.874031556 +0000 UTC m=+0.107847977 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 07 22:19:10 compute-0 nova_compute[192716]: 2025-10-07 22:19:10.602 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:19:10 compute-0 nova_compute[192716]: 2025-10-07 22:19:10.832 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:19:13 compute-0 podman[226959]: 2025-10-07 22:19:13.826626991 +0000 UTC m=+0.063331903 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, version=9.6, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.buildah.version=1.33.7, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, release=1755695350, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9)
Oct 07 22:19:15 compute-0 nova_compute[192716]: 2025-10-07 22:19:15.604 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:19:15 compute-0 nova_compute[192716]: 2025-10-07 22:19:15.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:19:20 compute-0 nova_compute[192716]: 2025-10-07 22:19:20.638 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:19:20 compute-0 nova_compute[192716]: 2025-10-07 22:19:20.838 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:19:25 compute-0 nova_compute[192716]: 2025-10-07 22:19:25.640 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:19:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:19:25.659 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:19:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:19:25.659 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:19:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:19:25.660 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:19:25 compute-0 nova_compute[192716]: 2025-10-07 22:19:25.840 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:19:26 compute-0 podman[226982]: 2025-10-07 22:19:26.835982187 +0000 UTC m=+0.074666025 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid)
Oct 07 22:19:26 compute-0 podman[226983]: 2025-10-07 22:19:26.834938397 +0000 UTC m=+0.074078519 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251007)
Oct 07 22:19:28 compute-0 ovn_controller[94904]: 2025-10-07T22:19:28Z|00242|memory_trim|INFO|Detected inactivity (last active 30010 ms ago): trimming memory
Oct 07 22:19:29 compute-0 podman[203153]: time="2025-10-07T22:19:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:19:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:19:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 22:19:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:19:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3029 "" "Go-http-client/1.1"
Oct 07 22:19:30 compute-0 nova_compute[192716]: 2025-10-07 22:19:30.642 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:19:30 compute-0 podman[227019]: 2025-10-07 22:19:30.833565129 +0000 UTC m=+0.077194489 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 07 22:19:30 compute-0 nova_compute[192716]: 2025-10-07 22:19:30.842 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:19:31 compute-0 openstack_network_exporter[205305]: ERROR   22:19:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:19:31 compute-0 openstack_network_exporter[205305]: ERROR   22:19:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:19:31 compute-0 openstack_network_exporter[205305]: ERROR   22:19:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:19:31 compute-0 openstack_network_exporter[205305]: ERROR   22:19:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:19:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:19:31 compute-0 openstack_network_exporter[205305]: ERROR   22:19:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:19:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:19:32 compute-0 nova_compute[192716]: 2025-10-07 22:19:32.118 2 DEBUG nova.virt.libvirt.driver [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec] Creating tmpfile /var/lib/nova/instances/tmpltgy2cpu to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Oct 07 22:19:32 compute-0 nova_compute[192716]: 2025-10-07 22:19:32.119 2 WARNING neutronclient.v2_0.client [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:19:32 compute-0 nova_compute[192716]: 2025-10-07 22:19:32.120 2 DEBUG nova.virt.libvirt.driver [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c8726889-9168-439f-8940-764b45f2a9f6] Creating tmpfile /var/lib/nova/instances/tmpr458mqn9 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Oct 07 22:19:32 compute-0 nova_compute[192716]: 2025-10-07 22:19:32.121 2 WARNING neutronclient.v2_0.client [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:19:32 compute-0 nova_compute[192716]: 2025-10-07 22:19:32.124 2 DEBUG nova.compute.manager [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpltgy2cpu',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Oct 07 22:19:32 compute-0 nova_compute[192716]: 2025-10-07 22:19:32.128 2 DEBUG nova.compute.manager [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpr458mqn9',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Oct 07 22:19:34 compute-0 nova_compute[192716]: 2025-10-07 22:19:34.175 2 WARNING neutronclient.v2_0.client [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:19:34 compute-0 nova_compute[192716]: 2025-10-07 22:19:34.211 2 WARNING neutronclient.v2_0.client [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:19:35 compute-0 nova_compute[192716]: 2025-10-07 22:19:35.644 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:19:35 compute-0 nova_compute[192716]: 2025-10-07 22:19:35.844 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:19:38 compute-0 nova_compute[192716]: 2025-10-07 22:19:38.460 2 DEBUG nova.compute.manager [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpltgy2cpu',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Oct 07 22:19:38 compute-0 podman[227045]: 2025-10-07 22:19:38.872296967 +0000 UTC m=+0.097880235 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct 07 22:19:38 compute-0 podman[227044]: 2025-10-07 22:19:38.880921409 +0000 UTC m=+0.110084382 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct 07 22:19:39 compute-0 nova_compute[192716]: 2025-10-07 22:19:39.475 2 DEBUG oslo_concurrency.lockutils [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "refresh_cache-c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 07 22:19:39 compute-0 nova_compute[192716]: 2025-10-07 22:19:39.476 2 DEBUG oslo_concurrency.lockutils [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquired lock "refresh_cache-c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 07 22:19:39 compute-0 nova_compute[192716]: 2025-10-07 22:19:39.476 2 DEBUG nova.network.neutron [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 07 22:19:39 compute-0 nova_compute[192716]: 2025-10-07 22:19:39.983 2 WARNING neutronclient.v2_0.client [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:19:40 compute-0 nova_compute[192716]: 2025-10-07 22:19:40.672 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:19:40 compute-0 nova_compute[192716]: 2025-10-07 22:19:40.846 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:19:40 compute-0 nova_compute[192716]: 2025-10-07 22:19:40.934 2 WARNING neutronclient.v2_0.client [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:19:40 compute-0 nova_compute[192716]: 2025-10-07 22:19:40.989 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:19:40 compute-0 nova_compute[192716]: 2025-10-07 22:19:40.990 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 07 22:19:41 compute-0 nova_compute[192716]: 2025-10-07 22:19:41.071 2 DEBUG nova.network.neutron [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec] Updating instance_info_cache with network_info: [{"id": "da0c4ab2-4043-426e-a322-76ffcf2e1751", "address": "fa:16:3e:46:9e:e5", "network": {"id": "afe7be80-c16b-4cef-89c4-8851641c6faf", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-895931661-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f18b68e1837842c293e8b6a621641354", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda0c4ab2-40", "ovs_interfaceid": "da0c4ab2-4043-426e-a322-76ffcf2e1751", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:19:41 compute-0 nova_compute[192716]: 2025-10-07 22:19:41.578 2 DEBUG oslo_concurrency.lockutils [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Releasing lock "refresh_cache-c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 07 22:19:41 compute-0 nova_compute[192716]: 2025-10-07 22:19:41.594 2 DEBUG nova.virt.libvirt.driver [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpltgy2cpu',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Oct 07 22:19:41 compute-0 nova_compute[192716]: 2025-10-07 22:19:41.595 2 DEBUG nova.virt.libvirt.driver [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec] Creating instance directory: /var/lib/nova/instances/c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Oct 07 22:19:41 compute-0 nova_compute[192716]: 2025-10-07 22:19:41.595 2 DEBUG nova.virt.libvirt.driver [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec] Creating disk.info with the contents: {'/var/lib/nova/instances/c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec/disk': 'qcow2', '/var/lib/nova/instances/c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Oct 07 22:19:41 compute-0 nova_compute[192716]: 2025-10-07 22:19:41.596 2 DEBUG nova.virt.libvirt.driver [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Oct 07 22:19:41 compute-0 nova_compute[192716]: 2025-10-07 22:19:41.596 2 DEBUG nova.objects.instance [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lazy-loading 'trusted_certs' on Instance uuid c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 22:19:42 compute-0 nova_compute[192716]: 2025-10-07 22:19:42.102 2 DEBUG oslo_utils.imageutils.format_inspector [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:19:42 compute-0 nova_compute[192716]: 2025-10-07 22:19:42.106 2 DEBUG oslo_utils.imageutils.format_inspector [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:19:42 compute-0 nova_compute[192716]: 2025-10-07 22:19:42.108 2 DEBUG oslo_concurrency.processutils [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:19:42 compute-0 nova_compute[192716]: 2025-10-07 22:19:42.167 2 DEBUG oslo_concurrency.processutils [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:19:42 compute-0 nova_compute[192716]: 2025-10-07 22:19:42.169 2 DEBUG oslo_concurrency.lockutils [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:19:42 compute-0 nova_compute[192716]: 2025-10-07 22:19:42.170 2 DEBUG oslo_concurrency.lockutils [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:19:42 compute-0 nova_compute[192716]: 2025-10-07 22:19:42.171 2 DEBUG oslo_utils.imageutils.format_inspector [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:19:42 compute-0 nova_compute[192716]: 2025-10-07 22:19:42.175 2 DEBUG oslo_utils.imageutils.format_inspector [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:19:42 compute-0 nova_compute[192716]: 2025-10-07 22:19:42.176 2 DEBUG oslo_concurrency.processutils [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:19:42 compute-0 nova_compute[192716]: 2025-10-07 22:19:42.234 2 DEBUG oslo_concurrency.processutils [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:19:42 compute-0 nova_compute[192716]: 2025-10-07 22:19:42.235 2 DEBUG oslo_concurrency.processutils [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71,backing_fmt=raw /var/lib/nova/instances/c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:19:42 compute-0 nova_compute[192716]: 2025-10-07 22:19:42.277 2 DEBUG oslo_concurrency.processutils [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71,backing_fmt=raw /var/lib/nova/instances/c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec/disk 1073741824" returned: 0 in 0.043s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:19:42 compute-0 nova_compute[192716]: 2025-10-07 22:19:42.279 2 DEBUG oslo_concurrency.lockutils [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.109s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:19:42 compute-0 nova_compute[192716]: 2025-10-07 22:19:42.280 2 DEBUG oslo_concurrency.processutils [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:19:42 compute-0 nova_compute[192716]: 2025-10-07 22:19:42.338 2 DEBUG oslo_concurrency.processutils [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:19:42 compute-0 nova_compute[192716]: 2025-10-07 22:19:42.340 2 DEBUG nova.virt.disk.api [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Checking if we can resize image /var/lib/nova/instances/c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 07 22:19:42 compute-0 nova_compute[192716]: 2025-10-07 22:19:42.340 2 DEBUG oslo_concurrency.processutils [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:19:42 compute-0 nova_compute[192716]: 2025-10-07 22:19:42.401 2 DEBUG oslo_concurrency.processutils [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:19:42 compute-0 nova_compute[192716]: 2025-10-07 22:19:42.402 2 DEBUG nova.virt.disk.api [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Cannot resize image /var/lib/nova/instances/c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 07 22:19:42 compute-0 nova_compute[192716]: 2025-10-07 22:19:42.402 2 DEBUG nova.objects.instance [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lazy-loading 'migration_context' on Instance uuid c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 22:19:42 compute-0 nova_compute[192716]: 2025-10-07 22:19:42.909 2 DEBUG nova.objects.base [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Object Instance<c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Oct 07 22:19:42 compute-0 nova_compute[192716]: 2025-10-07 22:19:42.910 2 DEBUG oslo_concurrency.processutils [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:19:42 compute-0 nova_compute[192716]: 2025-10-07 22:19:42.942 2 DEBUG oslo_concurrency.processutils [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec/disk.config 497664" returned: 0 in 0.032s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:19:42 compute-0 nova_compute[192716]: 2025-10-07 22:19:42.943 2 DEBUG nova.virt.libvirt.driver [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Oct 07 22:19:42 compute-0 nova_compute[192716]: 2025-10-07 22:19:42.944 2 DEBUG nova.virt.libvirt.vif [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-10-07T22:18:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-265958160',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-2659581',id=28,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-07T22:18:38Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='4cb01004a26f472187e01e5d3a57f84a',ramdisk_id='',reservation_id='r-z0jsmwo0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-866189760',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-866189760-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T22:18:38Z,user_data=None,user_id='a0c373c3cf7242d4af22e259b5a27a6b',uuid=c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "da0c4ab2-4043-426e-a322-76ffcf2e1751", "address": "fa:16:3e:46:9e:e5", "network": {"id": "afe7be80-c16b-4cef-89c4-8851641c6faf", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-895931661-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f18b68e1837842c293e8b6a621641354", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapda0c4ab2-40", "ovs_interfaceid": "da0c4ab2-4043-426e-a322-76ffcf2e1751", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 07 22:19:42 compute-0 nova_compute[192716]: 2025-10-07 22:19:42.944 2 DEBUG nova.network.os_vif_util [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Converting VIF {"id": "da0c4ab2-4043-426e-a322-76ffcf2e1751", "address": "fa:16:3e:46:9e:e5", "network": {"id": "afe7be80-c16b-4cef-89c4-8851641c6faf", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-895931661-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f18b68e1837842c293e8b6a621641354", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapda0c4ab2-40", "ovs_interfaceid": "da0c4ab2-4043-426e-a322-76ffcf2e1751", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 22:19:42 compute-0 nova_compute[192716]: 2025-10-07 22:19:42.945 2 DEBUG nova.network.os_vif_util [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:46:9e:e5,bridge_name='br-int',has_traffic_filtering=True,id=da0c4ab2-4043-426e-a322-76ffcf2e1751,network=Network(afe7be80-c16b-4cef-89c4-8851641c6faf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda0c4ab2-40') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 22:19:42 compute-0 nova_compute[192716]: 2025-10-07 22:19:42.945 2 DEBUG os_vif [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:46:9e:e5,bridge_name='br-int',has_traffic_filtering=True,id=da0c4ab2-4043-426e-a322-76ffcf2e1751,network=Network(afe7be80-c16b-4cef-89c4-8851641c6faf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda0c4ab2-40') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 07 22:19:42 compute-0 nova_compute[192716]: 2025-10-07 22:19:42.946 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:19:42 compute-0 nova_compute[192716]: 2025-10-07 22:19:42.946 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:19:42 compute-0 nova_compute[192716]: 2025-10-07 22:19:42.947 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:19:42 compute-0 nova_compute[192716]: 2025-10-07 22:19:42.948 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:19:42 compute-0 nova_compute[192716]: 2025-10-07 22:19:42.948 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '77ebf47e-c099-55fe-aaac-edacb33c1def', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:19:42 compute-0 nova_compute[192716]: 2025-10-07 22:19:42.949 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:19:42 compute-0 nova_compute[192716]: 2025-10-07 22:19:42.953 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 07 22:19:42 compute-0 nova_compute[192716]: 2025-10-07 22:19:42.959 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:19:42 compute-0 nova_compute[192716]: 2025-10-07 22:19:42.959 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapda0c4ab2-40, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:19:42 compute-0 nova_compute[192716]: 2025-10-07 22:19:42.960 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapda0c4ab2-40, col_values=(('qos', UUID('92d8c085-ca32-4a97-899c-e6bcd925f2d3')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:19:42 compute-0 nova_compute[192716]: 2025-10-07 22:19:42.960 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapda0c4ab2-40, col_values=(('external_ids', {'iface-id': 'da0c4ab2-4043-426e-a322-76ffcf2e1751', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:46:9e:e5', 'vm-uuid': 'c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:19:42 compute-0 nova_compute[192716]: 2025-10-07 22:19:42.962 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:19:42 compute-0 NetworkManager[51722]: <info>  [1759875582.9631] manager: (tapda0c4ab2-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/87)
Oct 07 22:19:42 compute-0 nova_compute[192716]: 2025-10-07 22:19:42.964 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 07 22:19:42 compute-0 nova_compute[192716]: 2025-10-07 22:19:42.968 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:19:42 compute-0 nova_compute[192716]: 2025-10-07 22:19:42.969 2 INFO os_vif [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:46:9e:e5,bridge_name='br-int',has_traffic_filtering=True,id=da0c4ab2-4043-426e-a322-76ffcf2e1751,network=Network(afe7be80-c16b-4cef-89c4-8851641c6faf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda0c4ab2-40')
Oct 07 22:19:42 compute-0 nova_compute[192716]: 2025-10-07 22:19:42.970 2 DEBUG nova.virt.libvirt.driver [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Oct 07 22:19:42 compute-0 nova_compute[192716]: 2025-10-07 22:19:42.970 2 DEBUG nova.compute.manager [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpltgy2cpu',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Oct 07 22:19:42 compute-0 nova_compute[192716]: 2025-10-07 22:19:42.971 2 WARNING neutronclient.v2_0.client [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:19:43 compute-0 nova_compute[192716]: 2025-10-07 22:19:43.537 2 WARNING neutronclient.v2_0.client [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:19:43 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:19:43.787 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=30, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:e1:cc', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '62:84:ba:a4:1b:31'}, ipsec=False) old=SB_Global(nb_cfg=29) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:19:43 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:19:43.788 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 07 22:19:43 compute-0 nova_compute[192716]: 2025-10-07 22:19:43.789 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:19:44 compute-0 nova_compute[192716]: 2025-10-07 22:19:44.616 2 DEBUG nova.network.neutron [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec] Port da0c4ab2-4043-426e-a322-76ffcf2e1751 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Oct 07 22:19:44 compute-0 nova_compute[192716]: 2025-10-07 22:19:44.632 2 DEBUG nova.compute.manager [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpltgy2cpu',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Oct 07 22:19:44 compute-0 podman[227112]: 2025-10-07 22:19:44.851303685 +0000 UTC m=+0.087239153 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., name=ubi9-minimal, build-date=2025-08-20T13:12:41, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, vendor=Red Hat, Inc., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct 07 22:19:44 compute-0 nova_compute[192716]: 2025-10-07 22:19:44.991 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:19:45 compute-0 nova_compute[192716]: 2025-10-07 22:19:45.893 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:19:47 compute-0 kernel: tapda0c4ab2-40: entered promiscuous mode
Oct 07 22:19:47 compute-0 NetworkManager[51722]: <info>  [1759875587.8452] manager: (tapda0c4ab2-40): new Tun device (/org/freedesktop/NetworkManager/Devices/88)
Oct 07 22:19:47 compute-0 ovn_controller[94904]: 2025-10-07T22:19:47Z|00243|binding|INFO|Claiming lport da0c4ab2-4043-426e-a322-76ffcf2e1751 for this additional chassis.
Oct 07 22:19:47 compute-0 ovn_controller[94904]: 2025-10-07T22:19:47Z|00244|binding|INFO|da0c4ab2-4043-426e-a322-76ffcf2e1751: Claiming fa:16:3e:46:9e:e5 10.100.0.10
Oct 07 22:19:47 compute-0 nova_compute[192716]: 2025-10-07 22:19:47.846 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:19:47 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:19:47.853 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:46:9e:e5 10.100.0.10'], port_security=['fa:16:3e:46:9e:e5 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-afe7be80-c16b-4cef-89c4-8851641c6faf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4cb01004a26f472187e01e5d3a57f84a', 'neutron:revision_number': '10', 'neutron:security_group_ids': '93dab7df-ccdf-44ad-a320-72fe683eb516', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=79a90f3c-820c-43b7-a388-8b7a51286af4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=da0c4ab2-4043-426e-a322-76ffcf2e1751) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:19:47 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:19:47.854 103791 INFO neutron.agent.ovn.metadata.agent [-] Port da0c4ab2-4043-426e-a322-76ffcf2e1751 in datapath afe7be80-c16b-4cef-89c4-8851641c6faf unbound from our chassis
Oct 07 22:19:47 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:19:47.855 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network afe7be80-c16b-4cef-89c4-8851641c6faf
Oct 07 22:19:47 compute-0 ovn_controller[94904]: 2025-10-07T22:19:47Z|00245|binding|INFO|Setting lport da0c4ab2-4043-426e-a322-76ffcf2e1751 ovn-installed in OVS
Oct 07 22:19:47 compute-0 nova_compute[192716]: 2025-10-07 22:19:47.860 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:19:47 compute-0 nova_compute[192716]: 2025-10-07 22:19:47.861 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:19:47 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:19:47.874 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[fa258d94-7d33-48e1-bc89-1f3420c359ac]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:19:47 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:19:47.875 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapafe7be80-c1 in ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Oct 07 22:19:47 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:19:47.877 214116 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapafe7be80-c0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Oct 07 22:19:47 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:19:47.877 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[0355ebf8-9335-4f43-a105-653cad1b0b65]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:19:47 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:19:47.878 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[9424eccf-ca32-46f1-a5d4-72d10a187ec7]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:19:47 compute-0 systemd-udevd[227148]: Network interface NamePolicy= disabled on kernel command line.
Oct 07 22:19:47 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:19:47.892 103905 DEBUG oslo.privsep.daemon [-] privsep: reply[d6cb4255-9d3c-450a-864f-40f691056f61]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:19:47 compute-0 systemd-machined[152719]: New machine qemu-21-instance-0000001c.
Oct 07 22:19:47 compute-0 NetworkManager[51722]: <info>  [1759875587.9014] device (tapda0c4ab2-40): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 07 22:19:47 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:19:47.899 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[fa313d9e-c393-46cf-9775-478c9af1e5ca]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:19:47 compute-0 NetworkManager[51722]: <info>  [1759875587.9025] device (tapda0c4ab2-40): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 07 22:19:47 compute-0 systemd[1]: Started Virtual Machine qemu-21-instance-0000001c.
Oct 07 22:19:47 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:19:47.948 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[041a094e-d019-43ae-a3f0-c6cd1caac575]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:19:47 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:19:47.957 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[bcac76a2-4ea2-4325-bdac-588b6c155c0d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:19:47 compute-0 NetworkManager[51722]: <info>  [1759875587.9596] manager: (tapafe7be80-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/89)
Oct 07 22:19:47 compute-0 nova_compute[192716]: 2025-10-07 22:19:47.962 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:19:47 compute-0 nova_compute[192716]: 2025-10-07 22:19:47.986 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:19:47 compute-0 nova_compute[192716]: 2025-10-07 22:19:47.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:19:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:19:48.004 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[c5494254-f87b-4fe3-a70b-9306ad2cfd45]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:19:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:19:48.007 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[00b451da-2912-4bca-86de-55b494e2c3b2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:19:48 compute-0 NetworkManager[51722]: <info>  [1759875588.0395] device (tapafe7be80-c0): carrier: link connected
Oct 07 22:19:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:19:48.048 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[f23ce5fd-8174-4e79-a95c-05d974ef379c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:19:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:19:48.079 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[f7c4cd06-cdcf-4a20-9716-331a3fe7bc6f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapafe7be80-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2b:b4:6d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 65], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 532161, 'reachable_time': 21415, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227183, 'error': None, 'target': 'ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:19:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:19:48.101 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[11a2668a-d22f-49fd-96e0-c6e2c0c298c6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2b:b46d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 532161, 'tstamp': 532161}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227184, 'error': None, 'target': 'ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:19:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:19:48.128 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[2b83978c-5e8f-4aaa-acbb-a787652d299f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapafe7be80-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2b:b4:6d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 65], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 532161, 'reachable_time': 21415, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227185, 'error': None, 'target': 'ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:19:48 compute-0 sshd-session[227182]: Connection closed by 164.92.202.181 port 58642
Oct 07 22:19:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:19:48.168 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[74033e98-abaa-49c2-a0d7-3002769b86ca]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:19:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:19:48.252 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[eef41c52-07f4-4b78-8bc0-620341618da2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:19:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:19:48.253 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapafe7be80-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:19:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:19:48.254 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:19:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:19:48.254 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapafe7be80-c0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:19:48 compute-0 nova_compute[192716]: 2025-10-07 22:19:48.300 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:19:48 compute-0 NetworkManager[51722]: <info>  [1759875588.3010] manager: (tapafe7be80-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/90)
Oct 07 22:19:48 compute-0 kernel: tapafe7be80-c0: entered promiscuous mode
Oct 07 22:19:48 compute-0 nova_compute[192716]: 2025-10-07 22:19:48.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:19:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:19:48.305 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapafe7be80-c0, col_values=(('external_ids', {'iface-id': 'b656ca07-6e70-4919-b525-077e26d9c217'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:19:48 compute-0 nova_compute[192716]: 2025-10-07 22:19:48.307 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:19:48 compute-0 ovn_controller[94904]: 2025-10-07T22:19:48Z|00246|binding|INFO|Releasing lport b656ca07-6e70-4919-b525-077e26d9c217 from this chassis (sb_readonly=0)
Oct 07 22:19:48 compute-0 nova_compute[192716]: 2025-10-07 22:19:48.307 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:19:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:19:48.310 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[cb7f6e62-e8ad-474c-a37a-93acd8652bc1]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:19:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:19:48.311 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/afe7be80-c16b-4cef-89c4-8851641c6faf.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/afe7be80-c16b-4cef-89c4-8851641c6faf.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 22:19:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:19:48.311 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/afe7be80-c16b-4cef-89c4-8851641c6faf.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/afe7be80-c16b-4cef-89c4-8851641c6faf.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 22:19:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:19:48.311 103791 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for afe7be80-c16b-4cef-89c4-8851641c6faf disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Oct 07 22:19:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:19:48.312 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/afe7be80-c16b-4cef-89c4-8851641c6faf.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/afe7be80-c16b-4cef-89c4-8851641c6faf.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 22:19:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:19:48.312 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[1b4e190f-e91f-4a08-a4e9-997eaa09838d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:19:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:19:48.313 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/afe7be80-c16b-4cef-89c4-8851641c6faf.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/afe7be80-c16b-4cef-89c4-8851641c6faf.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 22:19:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:19:48.313 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[93638a12-59ee-4d14-ba82-81dcda174ae2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:19:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:19:48.314 103791 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Oct 07 22:19:48 compute-0 ovn_metadata_agent[103786]: global
Oct 07 22:19:48 compute-0 ovn_metadata_agent[103786]:     log         /dev/log local0 debug
Oct 07 22:19:48 compute-0 ovn_metadata_agent[103786]:     log-tag     haproxy-metadata-proxy-afe7be80-c16b-4cef-89c4-8851641c6faf
Oct 07 22:19:48 compute-0 ovn_metadata_agent[103786]:     user        root
Oct 07 22:19:48 compute-0 ovn_metadata_agent[103786]:     group       root
Oct 07 22:19:48 compute-0 ovn_metadata_agent[103786]:     maxconn     1024
Oct 07 22:19:48 compute-0 ovn_metadata_agent[103786]:     pidfile     /var/lib/neutron/external/pids/afe7be80-c16b-4cef-89c4-8851641c6faf.pid.haproxy
Oct 07 22:19:48 compute-0 ovn_metadata_agent[103786]:     daemon
Oct 07 22:19:48 compute-0 ovn_metadata_agent[103786]: 
Oct 07 22:19:48 compute-0 ovn_metadata_agent[103786]: defaults
Oct 07 22:19:48 compute-0 ovn_metadata_agent[103786]:     log global
Oct 07 22:19:48 compute-0 ovn_metadata_agent[103786]:     mode http
Oct 07 22:19:48 compute-0 ovn_metadata_agent[103786]:     option httplog
Oct 07 22:19:48 compute-0 ovn_metadata_agent[103786]:     option dontlognull
Oct 07 22:19:48 compute-0 ovn_metadata_agent[103786]:     option http-server-close
Oct 07 22:19:48 compute-0 ovn_metadata_agent[103786]:     option forwardfor
Oct 07 22:19:48 compute-0 ovn_metadata_agent[103786]:     retries                 3
Oct 07 22:19:48 compute-0 ovn_metadata_agent[103786]:     timeout http-request    30s
Oct 07 22:19:48 compute-0 ovn_metadata_agent[103786]:     timeout connect         30s
Oct 07 22:19:48 compute-0 ovn_metadata_agent[103786]:     timeout client          32s
Oct 07 22:19:48 compute-0 ovn_metadata_agent[103786]:     timeout server          32s
Oct 07 22:19:48 compute-0 ovn_metadata_agent[103786]:     timeout http-keep-alive 30s
Oct 07 22:19:48 compute-0 ovn_metadata_agent[103786]: 
Oct 07 22:19:48 compute-0 ovn_metadata_agent[103786]: listen listener
Oct 07 22:19:48 compute-0 ovn_metadata_agent[103786]:     bind 169.254.169.254:80
Oct 07 22:19:48 compute-0 ovn_metadata_agent[103786]:     
Oct 07 22:19:48 compute-0 ovn_metadata_agent[103786]:     server metadata /var/lib/neutron/metadata_proxy
Oct 07 22:19:48 compute-0 ovn_metadata_agent[103786]: 
Oct 07 22:19:48 compute-0 ovn_metadata_agent[103786]:     http-request add-header X-OVN-Network-ID afe7be80-c16b-4cef-89c4-8851641c6faf
Oct 07 22:19:48 compute-0 ovn_metadata_agent[103786]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Oct 07 22:19:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:19:48.316 103791 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf', 'env', 'PROCESS_TAG=haproxy-afe7be80-c16b-4cef-89c4-8851641c6faf', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/afe7be80-c16b-4cef-89c4-8851641c6faf.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Oct 07 22:19:48 compute-0 nova_compute[192716]: 2025-10-07 22:19:48.320 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:19:48 compute-0 nova_compute[192716]: 2025-10-07 22:19:48.502 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:19:48 compute-0 nova_compute[192716]: 2025-10-07 22:19:48.503 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:19:48 compute-0 nova_compute[192716]: 2025-10-07 22:19:48.504 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:19:48 compute-0 nova_compute[192716]: 2025-10-07 22:19:48.504 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 07 22:19:48 compute-0 podman[227225]: 2025-10-07 22:19:48.728020221 +0000 UTC m=+0.053549518 container create 61aee5663c3bc0d84b94bf068dd6d11e4f035c29e97cb67760b424fe87aed873 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS)
Oct 07 22:19:48 compute-0 systemd[1]: Started libpod-conmon-61aee5663c3bc0d84b94bf068dd6d11e4f035c29e97cb67760b424fe87aed873.scope.
Oct 07 22:19:48 compute-0 systemd[1]: Started libcrun container.
Oct 07 22:19:48 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c5ada2d84d1e147d07a035c0874a6255b21503c6f560a20499898ae4d1616faa/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 07 22:19:48 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:19:48.790 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=dca786dc-b408-4181-8e47-0e14c60f13da, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '30'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:19:48 compute-0 podman[227225]: 2025-10-07 22:19:48.699707263 +0000 UTC m=+0.025236580 image pull 24d4277b41bbd1d97b6f360ea068040fe96182680512bacad34d1f578f4798a9 38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Oct 07 22:19:48 compute-0 podman[227225]: 2025-10-07 22:19:48.799572194 +0000 UTC m=+0.125101531 container init 61aee5663c3bc0d84b94bf068dd6d11e4f035c29e97cb67760b424fe87aed873 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.build-date=20251007)
Oct 07 22:19:48 compute-0 podman[227225]: 2025-10-07 22:19:48.806640781 +0000 UTC m=+0.132170108 container start 61aee5663c3bc0d84b94bf068dd6d11e4f035c29e97cb67760b424fe87aed873 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251007)
Oct 07 22:19:48 compute-0 neutron-haproxy-ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf[227242]: [NOTICE]   (227246) : New worker (227248) forked
Oct 07 22:19:48 compute-0 neutron-haproxy-ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf[227242]: [NOTICE]   (227246) : Loading success.
Oct 07 22:19:49 compute-0 nova_compute[192716]: 2025-10-07 22:19:49.557 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:19:49 compute-0 nova_compute[192716]: 2025-10-07 22:19:49.612 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:19:49 compute-0 nova_compute[192716]: 2025-10-07 22:19:49.614 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:19:49 compute-0 nova_compute[192716]: 2025-10-07 22:19:49.670 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:19:49 compute-0 nova_compute[192716]: 2025-10-07 22:19:49.810 2 WARNING nova.virt.libvirt.driver [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 22:19:49 compute-0 nova_compute[192716]: 2025-10-07 22:19:49.811 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:19:49 compute-0 nova_compute[192716]: 2025-10-07 22:19:49.829 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:19:49 compute-0 nova_compute[192716]: 2025-10-07 22:19:49.829 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5796MB free_disk=73.29821014404297GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 07 22:19:49 compute-0 nova_compute[192716]: 2025-10-07 22:19:49.829 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:19:49 compute-0 nova_compute[192716]: 2025-10-07 22:19:49.830 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:19:50 compute-0 ovn_controller[94904]: 2025-10-07T22:19:50Z|00247|binding|INFO|Claiming lport da0c4ab2-4043-426e-a322-76ffcf2e1751 for this chassis.
Oct 07 22:19:50 compute-0 ovn_controller[94904]: 2025-10-07T22:19:50Z|00248|binding|INFO|da0c4ab2-4043-426e-a322-76ffcf2e1751: Claiming fa:16:3e:46:9e:e5 10.100.0.10
Oct 07 22:19:50 compute-0 ovn_controller[94904]: 2025-10-07T22:19:50Z|00249|binding|INFO|Setting lport da0c4ab2-4043-426e-a322-76ffcf2e1751 up in Southbound
Oct 07 22:19:50 compute-0 nova_compute[192716]: 2025-10-07 22:19:50.848 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Migration for instance c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Oct 07 22:19:50 compute-0 nova_compute[192716]: 2025-10-07 22:19:50.849 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Migration for instance c8726889-9168-439f-8940-764b45f2a9f6 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Oct 07 22:19:50 compute-0 nova_compute[192716]: 2025-10-07 22:19:50.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:19:51 compute-0 sshd-session[227277]: Invalid user david from 103.115.24.11 port 49904
Oct 07 22:19:51 compute-0 sshd-session[227277]: pam_unix(sshd:auth): check pass; user unknown
Oct 07 22:19:51 compute-0 sshd-session[227277]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.115.24.11
Oct 07 22:19:51 compute-0 nova_compute[192716]: 2025-10-07 22:19:51.866 2 INFO nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] [instance: c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec] Updating resource usage from migration fb6e5442-9748-4690-a5df-0d1e4279eb0f
Oct 07 22:19:51 compute-0 nova_compute[192716]: 2025-10-07 22:19:51.867 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] [instance: c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec] Starting to track incoming migration fb6e5442-9748-4690-a5df-0d1e4279eb0f with flavor e6ccb6f6-2ec4-4305-9fdd-4d931e83ed21 _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Oct 07 22:19:52 compute-0 nova_compute[192716]: 2025-10-07 22:19:52.381 2 INFO nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] [instance: c8726889-9168-439f-8940-764b45f2a9f6] Updating resource usage from migration 01b19c84-1670-4441-8285-5e84795ec098
Oct 07 22:19:52 compute-0 nova_compute[192716]: 2025-10-07 22:19:52.382 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] [instance: c8726889-9168-439f-8940-764b45f2a9f6] Starting to track incoming migration 01b19c84-1670-4441-8285-5e84795ec098 with flavor e6ccb6f6-2ec4-4305-9fdd-4d931e83ed21 _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Oct 07 22:19:52 compute-0 nova_compute[192716]: 2025-10-07 22:19:52.965 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:19:53 compute-0 sshd-session[227277]: Failed password for invalid user david from 103.115.24.11 port 49904 ssh2
Oct 07 22:19:53 compute-0 nova_compute[192716]: 2025-10-07 22:19:53.433 2 WARNING nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Instance c8726889-9168-439f-8940-764b45f2a9f6 has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Oct 07 22:19:53 compute-0 sshd-session[227277]: Received disconnect from 103.115.24.11 port 49904:11: Bye Bye [preauth]
Oct 07 22:19:53 compute-0 sshd-session[227277]: Disconnected from invalid user david 103.115.24.11 port 49904 [preauth]
Oct 07 22:19:53 compute-0 nova_compute[192716]: 2025-10-07 22:19:53.608 2 INFO nova.compute.manager [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec] Post operation of migration started
Oct 07 22:19:53 compute-0 nova_compute[192716]: 2025-10-07 22:19:53.609 2 WARNING neutronclient.v2_0.client [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:19:53 compute-0 nova_compute[192716]: 2025-10-07 22:19:53.939 2 WARNING nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Instance c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Oct 07 22:19:53 compute-0 nova_compute[192716]: 2025-10-07 22:19:53.940 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 07 22:19:53 compute-0 nova_compute[192716]: 2025-10-07 22:19:53.940 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:19:49 up  1:28,  0 user,  load average: 0.03, 0.09, 0.17\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 07 22:19:54 compute-0 nova_compute[192716]: 2025-10-07 22:19:54.005 2 DEBUG nova.compute.provider_tree [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 22:19:54 compute-0 nova_compute[192716]: 2025-10-07 22:19:54.512 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 22:19:54 compute-0 nova_compute[192716]: 2025-10-07 22:19:54.559 2 WARNING neutronclient.v2_0.client [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:19:54 compute-0 nova_compute[192716]: 2025-10-07 22:19:54.559 2 WARNING neutronclient.v2_0.client [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:19:54 compute-0 nova_compute[192716]: 2025-10-07 22:19:54.657 2 DEBUG oslo_concurrency.lockutils [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "refresh_cache-c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 07 22:19:54 compute-0 nova_compute[192716]: 2025-10-07 22:19:54.658 2 DEBUG oslo_concurrency.lockutils [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquired lock "refresh_cache-c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 07 22:19:54 compute-0 nova_compute[192716]: 2025-10-07 22:19:54.659 2 DEBUG nova.network.neutron [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 07 22:19:55 compute-0 nova_compute[192716]: 2025-10-07 22:19:55.024 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 07 22:19:55 compute-0 nova_compute[192716]: 2025-10-07 22:19:55.025 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 5.195s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:19:55 compute-0 nova_compute[192716]: 2025-10-07 22:19:55.166 2 WARNING neutronclient.v2_0.client [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:19:55 compute-0 nova_compute[192716]: 2025-10-07 22:19:55.899 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:19:55 compute-0 nova_compute[192716]: 2025-10-07 22:19:55.945 2 WARNING neutronclient.v2_0.client [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:19:56 compute-0 nova_compute[192716]: 2025-10-07 22:19:56.119 2 DEBUG nova.network.neutron [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec] Updating instance_info_cache with network_info: [{"id": "da0c4ab2-4043-426e-a322-76ffcf2e1751", "address": "fa:16:3e:46:9e:e5", "network": {"id": "afe7be80-c16b-4cef-89c4-8851641c6faf", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-895931661-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f18b68e1837842c293e8b6a621641354", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda0c4ab2-40", "ovs_interfaceid": "da0c4ab2-4043-426e-a322-76ffcf2e1751", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:19:56 compute-0 nova_compute[192716]: 2025-10-07 22:19:56.625 2 DEBUG oslo_concurrency.lockutils [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Releasing lock "refresh_cache-c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 07 22:19:57 compute-0 nova_compute[192716]: 2025-10-07 22:19:57.156 2 DEBUG oslo_concurrency.lockutils [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:19:57 compute-0 nova_compute[192716]: 2025-10-07 22:19:57.157 2 DEBUG oslo_concurrency.lockutils [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:19:57 compute-0 nova_compute[192716]: 2025-10-07 22:19:57.158 2 DEBUG oslo_concurrency.lockutils [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:19:57 compute-0 nova_compute[192716]: 2025-10-07 22:19:57.165 2 INFO nova.virt.libvirt.driver [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Oct 07 22:19:57 compute-0 virtqemud[192532]: Domain id=21 name='instance-0000001c' uuid=c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec is tainted: custom-monitor
Oct 07 22:19:57 compute-0 podman[227279]: 2025-10-07 22:19:57.825811005 +0000 UTC m=+0.068243668 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid)
Oct 07 22:19:57 compute-0 podman[227280]: 2025-10-07 22:19:57.825386292 +0000 UTC m=+0.065063834 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, config_id=multipathd, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 07 22:19:57 compute-0 nova_compute[192716]: 2025-10-07 22:19:57.967 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:19:58 compute-0 nova_compute[192716]: 2025-10-07 22:19:58.025 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:19:58 compute-0 nova_compute[192716]: 2025-10-07 22:19:58.174 2 INFO nova.virt.libvirt.driver [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Oct 07 22:19:58 compute-0 nova_compute[192716]: 2025-10-07 22:19:58.535 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:19:58 compute-0 nova_compute[192716]: 2025-10-07 22:19:58.536 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:19:58 compute-0 nova_compute[192716]: 2025-10-07 22:19:58.536 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:19:58 compute-0 nova_compute[192716]: 2025-10-07 22:19:58.536 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:19:59 compute-0 nova_compute[192716]: 2025-10-07 22:19:59.183 2 INFO nova.virt.libvirt.driver [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Oct 07 22:19:59 compute-0 nova_compute[192716]: 2025-10-07 22:19:59.188 2 DEBUG nova.compute.manager [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 07 22:19:59 compute-0 nova_compute[192716]: 2025-10-07 22:19:59.699 2 DEBUG nova.objects.instance [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Oct 07 22:19:59 compute-0 podman[203153]: time="2025-10-07T22:19:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:19:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:19:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20751 "" "Go-http-client/1.1"
Oct 07 22:19:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:19:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3491 "" "Go-http-client/1.1"
Oct 07 22:20:00 compute-0 nova_compute[192716]: 2025-10-07 22:20:00.721 2 WARNING neutronclient.v2_0.client [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:20:00 compute-0 nova_compute[192716]: 2025-10-07 22:20:00.901 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:01 compute-0 openstack_network_exporter[205305]: ERROR   22:20:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:20:01 compute-0 openstack_network_exporter[205305]: ERROR   22:20:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:20:01 compute-0 openstack_network_exporter[205305]: ERROR   22:20:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:20:01 compute-0 openstack_network_exporter[205305]: ERROR   22:20:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:20:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:20:01 compute-0 openstack_network_exporter[205305]: ERROR   22:20:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:20:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:20:01 compute-0 nova_compute[192716]: 2025-10-07 22:20:01.558 2 WARNING neutronclient.v2_0.client [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:20:01 compute-0 nova_compute[192716]: 2025-10-07 22:20:01.558 2 WARNING neutronclient.v2_0.client [None req-608bdfb1-5d74-433c-8d97-3d438e2222d0 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:20:01 compute-0 podman[227317]: 2025-10-07 22:20:01.846341689 +0000 UTC m=+0.076010635 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 07 22:20:02 compute-0 nova_compute[192716]: 2025-10-07 22:20:02.970 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:05 compute-0 nova_compute[192716]: 2025-10-07 22:20:05.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:07 compute-0 nova_compute[192716]: 2025-10-07 22:20:07.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:09 compute-0 podman[227342]: 2025-10-07 22:20:09.81926132 +0000 UTC m=+0.055815104 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 07 22:20:09 compute-0 podman[227341]: 2025-10-07 22:20:09.894826811 +0000 UTC m=+0.132153957 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251007, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 07 22:20:10 compute-0 nova_compute[192716]: 2025-10-07 22:20:10.904 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:11 compute-0 nova_compute[192716]: 2025-10-07 22:20:11.186 2 DEBUG nova.compute.manager [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpr458mqn9',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c8726889-9168-439f-8940-764b45f2a9f6',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Oct 07 22:20:12 compute-0 nova_compute[192716]: 2025-10-07 22:20:12.222 2 DEBUG oslo_concurrency.lockutils [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "refresh_cache-c8726889-9168-439f-8940-764b45f2a9f6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 07 22:20:12 compute-0 nova_compute[192716]: 2025-10-07 22:20:12.223 2 DEBUG oslo_concurrency.lockutils [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquired lock "refresh_cache-c8726889-9168-439f-8940-764b45f2a9f6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 07 22:20:12 compute-0 nova_compute[192716]: 2025-10-07 22:20:12.223 2 DEBUG nova.network.neutron [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c8726889-9168-439f-8940-764b45f2a9f6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 07 22:20:12 compute-0 nova_compute[192716]: 2025-10-07 22:20:12.729 2 WARNING neutronclient.v2_0.client [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:20:12 compute-0 nova_compute[192716]: 2025-10-07 22:20:12.974 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:14 compute-0 nova_compute[192716]: 2025-10-07 22:20:14.639 2 WARNING neutronclient.v2_0.client [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:20:15 compute-0 nova_compute[192716]: 2025-10-07 22:20:15.587 2 DEBUG nova.network.neutron [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c8726889-9168-439f-8940-764b45f2a9f6] Updating instance_info_cache with network_info: [{"id": "f6995bd7-e02c-4608-8ad3-9be2589f6bc7", "address": "fa:16:3e:dd:23:81", "network": {"id": "afe7be80-c16b-4cef-89c4-8851641c6faf", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-895931661-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f18b68e1837842c293e8b6a621641354", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6995bd7-e0", "ovs_interfaceid": "f6995bd7-e02c-4608-8ad3-9be2589f6bc7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:20:15 compute-0 podman[227388]: 2025-10-07 22:20:15.811711414 +0000 UTC m=+0.057398581 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, container_name=openstack_network_exporter, release=1755695350, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_id=edpm, architecture=x86_64)
Oct 07 22:20:15 compute-0 nova_compute[192716]: 2025-10-07 22:20:15.906 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:16 compute-0 nova_compute[192716]: 2025-10-07 22:20:16.095 2 DEBUG oslo_concurrency.lockutils [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Releasing lock "refresh_cache-c8726889-9168-439f-8940-764b45f2a9f6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 07 22:20:16 compute-0 nova_compute[192716]: 2025-10-07 22:20:16.108 2 DEBUG nova.virt.libvirt.driver [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c8726889-9168-439f-8940-764b45f2a9f6] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpr458mqn9',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c8726889-9168-439f-8940-764b45f2a9f6',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Oct 07 22:20:16 compute-0 nova_compute[192716]: 2025-10-07 22:20:16.109 2 DEBUG nova.virt.libvirt.driver [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c8726889-9168-439f-8940-764b45f2a9f6] Creating instance directory: /var/lib/nova/instances/c8726889-9168-439f-8940-764b45f2a9f6 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Oct 07 22:20:16 compute-0 nova_compute[192716]: 2025-10-07 22:20:16.109 2 DEBUG nova.virt.libvirt.driver [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c8726889-9168-439f-8940-764b45f2a9f6] Creating disk.info with the contents: {'/var/lib/nova/instances/c8726889-9168-439f-8940-764b45f2a9f6/disk': 'qcow2', '/var/lib/nova/instances/c8726889-9168-439f-8940-764b45f2a9f6/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Oct 07 22:20:16 compute-0 nova_compute[192716]: 2025-10-07 22:20:16.109 2 DEBUG nova.virt.libvirt.driver [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c8726889-9168-439f-8940-764b45f2a9f6] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Oct 07 22:20:16 compute-0 nova_compute[192716]: 2025-10-07 22:20:16.110 2 DEBUG nova.objects.instance [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lazy-loading 'trusted_certs' on Instance uuid c8726889-9168-439f-8940-764b45f2a9f6 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 22:20:16 compute-0 nova_compute[192716]: 2025-10-07 22:20:16.615 2 DEBUG oslo_utils.imageutils.format_inspector [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:20:16 compute-0 nova_compute[192716]: 2025-10-07 22:20:16.620 2 DEBUG oslo_utils.imageutils.format_inspector [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:20:16 compute-0 nova_compute[192716]: 2025-10-07 22:20:16.623 2 DEBUG oslo_concurrency.processutils [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:20:16 compute-0 nova_compute[192716]: 2025-10-07 22:20:16.673 2 DEBUG oslo_concurrency.processutils [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:20:16 compute-0 nova_compute[192716]: 2025-10-07 22:20:16.674 2 DEBUG oslo_concurrency.lockutils [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:20:16 compute-0 nova_compute[192716]: 2025-10-07 22:20:16.675 2 DEBUG oslo_concurrency.lockutils [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:20:16 compute-0 nova_compute[192716]: 2025-10-07 22:20:16.676 2 DEBUG oslo_utils.imageutils.format_inspector [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:20:16 compute-0 nova_compute[192716]: 2025-10-07 22:20:16.680 2 DEBUG oslo_utils.imageutils.format_inspector [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:20:16 compute-0 nova_compute[192716]: 2025-10-07 22:20:16.680 2 DEBUG oslo_concurrency.processutils [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:20:16 compute-0 nova_compute[192716]: 2025-10-07 22:20:16.737 2 DEBUG oslo_concurrency.processutils [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:20:16 compute-0 nova_compute[192716]: 2025-10-07 22:20:16.738 2 DEBUG oslo_concurrency.processutils [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71,backing_fmt=raw /var/lib/nova/instances/c8726889-9168-439f-8940-764b45f2a9f6/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:20:16 compute-0 nova_compute[192716]: 2025-10-07 22:20:16.770 2 DEBUG oslo_concurrency.processutils [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71,backing_fmt=raw /var/lib/nova/instances/c8726889-9168-439f-8940-764b45f2a9f6/disk 1073741824" returned: 0 in 0.031s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:20:16 compute-0 nova_compute[192716]: 2025-10-07 22:20:16.771 2 DEBUG oslo_concurrency.lockutils [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.096s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:20:16 compute-0 nova_compute[192716]: 2025-10-07 22:20:16.771 2 DEBUG oslo_concurrency.processutils [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:20:16 compute-0 nova_compute[192716]: 2025-10-07 22:20:16.836 2 DEBUG oslo_concurrency.processutils [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:20:16 compute-0 nova_compute[192716]: 2025-10-07 22:20:16.837 2 DEBUG nova.virt.disk.api [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Checking if we can resize image /var/lib/nova/instances/c8726889-9168-439f-8940-764b45f2a9f6/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 07 22:20:16 compute-0 nova_compute[192716]: 2025-10-07 22:20:16.837 2 DEBUG oslo_concurrency.processutils [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c8726889-9168-439f-8940-764b45f2a9f6/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:20:16 compute-0 nova_compute[192716]: 2025-10-07 22:20:16.901 2 DEBUG oslo_concurrency.processutils [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c8726889-9168-439f-8940-764b45f2a9f6/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:20:16 compute-0 nova_compute[192716]: 2025-10-07 22:20:16.902 2 DEBUG nova.virt.disk.api [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Cannot resize image /var/lib/nova/instances/c8726889-9168-439f-8940-764b45f2a9f6/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 07 22:20:16 compute-0 nova_compute[192716]: 2025-10-07 22:20:16.902 2 DEBUG nova.objects.instance [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lazy-loading 'migration_context' on Instance uuid c8726889-9168-439f-8940-764b45f2a9f6 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 22:20:17 compute-0 nova_compute[192716]: 2025-10-07 22:20:17.412 2 DEBUG nova.objects.base [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Object Instance<c8726889-9168-439f-8940-764b45f2a9f6> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Oct 07 22:20:17 compute-0 nova_compute[192716]: 2025-10-07 22:20:17.413 2 DEBUG oslo_concurrency.processutils [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/c8726889-9168-439f-8940-764b45f2a9f6/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:20:17 compute-0 nova_compute[192716]: 2025-10-07 22:20:17.432 2 DEBUG oslo_concurrency.processutils [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/c8726889-9168-439f-8940-764b45f2a9f6/disk.config 497664" returned: 0 in 0.019s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:20:17 compute-0 nova_compute[192716]: 2025-10-07 22:20:17.433 2 DEBUG nova.virt.libvirt.driver [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c8726889-9168-439f-8940-764b45f2a9f6] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Oct 07 22:20:17 compute-0 nova_compute[192716]: 2025-10-07 22:20:17.434 2 DEBUG nova.virt.libvirt.vif [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-10-07T22:18:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-540768949',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-5407689',id=29,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-07T22:19:00Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='4cb01004a26f472187e01e5d3a57f84a',ramdisk_id='',reservation_id='r-9apanns3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-866189760',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-866189760-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T22:19:00Z,user_data=None,user_id='a0c373c3cf7242d4af22e259b5a27a6b',uuid=c8726889-9168-439f-8940-764b45f2a9f6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f6995bd7-e02c-4608-8ad3-9be2589f6bc7", "address": "fa:16:3e:dd:23:81", "network": {"id": "afe7be80-c16b-4cef-89c4-8851641c6faf", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-895931661-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f18b68e1837842c293e8b6a621641354", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapf6995bd7-e0", "ovs_interfaceid": "f6995bd7-e02c-4608-8ad3-9be2589f6bc7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 07 22:20:17 compute-0 nova_compute[192716]: 2025-10-07 22:20:17.434 2 DEBUG nova.network.os_vif_util [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Converting VIF {"id": "f6995bd7-e02c-4608-8ad3-9be2589f6bc7", "address": "fa:16:3e:dd:23:81", "network": {"id": "afe7be80-c16b-4cef-89c4-8851641c6faf", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-895931661-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f18b68e1837842c293e8b6a621641354", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapf6995bd7-e0", "ovs_interfaceid": "f6995bd7-e02c-4608-8ad3-9be2589f6bc7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 22:20:17 compute-0 nova_compute[192716]: 2025-10-07 22:20:17.435 2 DEBUG nova.network.os_vif_util [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dd:23:81,bridge_name='br-int',has_traffic_filtering=True,id=f6995bd7-e02c-4608-8ad3-9be2589f6bc7,network=Network(afe7be80-c16b-4cef-89c4-8851641c6faf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6995bd7-e0') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 22:20:17 compute-0 nova_compute[192716]: 2025-10-07 22:20:17.435 2 DEBUG os_vif [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:dd:23:81,bridge_name='br-int',has_traffic_filtering=True,id=f6995bd7-e02c-4608-8ad3-9be2589f6bc7,network=Network(afe7be80-c16b-4cef-89c4-8851641c6faf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6995bd7-e0') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 07 22:20:17 compute-0 nova_compute[192716]: 2025-10-07 22:20:17.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:17 compute-0 nova_compute[192716]: 2025-10-07 22:20:17.436 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:20:17 compute-0 nova_compute[192716]: 2025-10-07 22:20:17.436 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:20:17 compute-0 nova_compute[192716]: 2025-10-07 22:20:17.437 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:17 compute-0 nova_compute[192716]: 2025-10-07 22:20:17.437 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '2d367cad-ca1a-505a-8414-b5c6c008583e', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:20:17 compute-0 nova_compute[192716]: 2025-10-07 22:20:17.439 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:17 compute-0 nova_compute[192716]: 2025-10-07 22:20:17.440 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:17 compute-0 nova_compute[192716]: 2025-10-07 22:20:17.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:17 compute-0 nova_compute[192716]: 2025-10-07 22:20:17.443 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf6995bd7-e0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:20:17 compute-0 nova_compute[192716]: 2025-10-07 22:20:17.443 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapf6995bd7-e0, col_values=(('qos', UUID('74d78e0c-ec21-4859-9af3-84e4d1c680cc')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:20:17 compute-0 nova_compute[192716]: 2025-10-07 22:20:17.443 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapf6995bd7-e0, col_values=(('external_ids', {'iface-id': 'f6995bd7-e02c-4608-8ad3-9be2589f6bc7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:dd:23:81', 'vm-uuid': 'c8726889-9168-439f-8940-764b45f2a9f6'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:20:17 compute-0 nova_compute[192716]: 2025-10-07 22:20:17.444 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:17 compute-0 NetworkManager[51722]: <info>  [1759875617.4458] manager: (tapf6995bd7-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/91)
Oct 07 22:20:17 compute-0 nova_compute[192716]: 2025-10-07 22:20:17.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 07 22:20:17 compute-0 nova_compute[192716]: 2025-10-07 22:20:17.453 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:17 compute-0 nova_compute[192716]: 2025-10-07 22:20:17.454 2 INFO os_vif [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:dd:23:81,bridge_name='br-int',has_traffic_filtering=True,id=f6995bd7-e02c-4608-8ad3-9be2589f6bc7,network=Network(afe7be80-c16b-4cef-89c4-8851641c6faf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6995bd7-e0')
Oct 07 22:20:17 compute-0 nova_compute[192716]: 2025-10-07 22:20:17.454 2 DEBUG nova.virt.libvirt.driver [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Oct 07 22:20:17 compute-0 nova_compute[192716]: 2025-10-07 22:20:17.454 2 DEBUG nova.compute.manager [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpr458mqn9',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c8726889-9168-439f-8940-764b45f2a9f6',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Oct 07 22:20:17 compute-0 nova_compute[192716]: 2025-10-07 22:20:17.455 2 WARNING neutronclient.v2_0.client [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:20:17 compute-0 nova_compute[192716]: 2025-10-07 22:20:17.567 2 WARNING neutronclient.v2_0.client [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:20:18 compute-0 nova_compute[192716]: 2025-10-07 22:20:18.440 2 DEBUG nova.network.neutron [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c8726889-9168-439f-8940-764b45f2a9f6] Port f6995bd7-e02c-4608-8ad3-9be2589f6bc7 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Oct 07 22:20:18 compute-0 nova_compute[192716]: 2025-10-07 22:20:18.449 2 DEBUG nova.compute.manager [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpr458mqn9',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c8726889-9168-439f-8940-764b45f2a9f6',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Oct 07 22:20:20 compute-0 ovn_controller[94904]: 2025-10-07T22:20:20Z|00250|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Oct 07 22:20:20 compute-0 nova_compute[192716]: 2025-10-07 22:20:20.908 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:21 compute-0 kernel: tapf6995bd7-e0: entered promiscuous mode
Oct 07 22:20:21 compute-0 NetworkManager[51722]: <info>  [1759875621.7147] manager: (tapf6995bd7-e0): new Tun device (/org/freedesktop/NetworkManager/Devices/92)
Oct 07 22:20:21 compute-0 nova_compute[192716]: 2025-10-07 22:20:21.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:21 compute-0 ovn_controller[94904]: 2025-10-07T22:20:21Z|00251|binding|INFO|Claiming lport f6995bd7-e02c-4608-8ad3-9be2589f6bc7 for this additional chassis.
Oct 07 22:20:21 compute-0 ovn_controller[94904]: 2025-10-07T22:20:21Z|00252|binding|INFO|f6995bd7-e02c-4608-8ad3-9be2589f6bc7: Claiming fa:16:3e:dd:23:81 10.100.0.7
Oct 07 22:20:21 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:20:21.727 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dd:23:81 10.100.0.7'], port_security=['fa:16:3e:dd:23:81 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'c8726889-9168-439f-8940-764b45f2a9f6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-afe7be80-c16b-4cef-89c4-8851641c6faf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4cb01004a26f472187e01e5d3a57f84a', 'neutron:revision_number': '10', 'neutron:security_group_ids': '93dab7df-ccdf-44ad-a320-72fe683eb516', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=79a90f3c-820c-43b7-a388-8b7a51286af4, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[], logical_port=f6995bd7-e02c-4608-8ad3-9be2589f6bc7) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:20:21 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:20:21.728 103791 INFO neutron.agent.ovn.metadata.agent [-] Port f6995bd7-e02c-4608-8ad3-9be2589f6bc7 in datapath afe7be80-c16b-4cef-89c4-8851641c6faf unbound from our chassis
Oct 07 22:20:21 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:20:21.730 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network afe7be80-c16b-4cef-89c4-8851641c6faf
Oct 07 22:20:21 compute-0 nova_compute[192716]: 2025-10-07 22:20:21.746 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:21 compute-0 ovn_controller[94904]: 2025-10-07T22:20:21Z|00253|binding|INFO|Setting lport f6995bd7-e02c-4608-8ad3-9be2589f6bc7 ovn-installed in OVS
Oct 07 22:20:21 compute-0 nova_compute[192716]: 2025-10-07 22:20:21.752 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:21 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:20:21.752 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[36718cae-bc64-429d-a4e7-d94c14c08d39]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:20:21 compute-0 systemd-udevd[227442]: Network interface NamePolicy= disabled on kernel command line.
Oct 07 22:20:21 compute-0 systemd-machined[152719]: New machine qemu-22-instance-0000001d.
Oct 07 22:20:21 compute-0 NetworkManager[51722]: <info>  [1759875621.7864] device (tapf6995bd7-e0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 07 22:20:21 compute-0 NetworkManager[51722]: <info>  [1759875621.7879] device (tapf6995bd7-e0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 07 22:20:21 compute-0 systemd[1]: Started Virtual Machine qemu-22-instance-0000001d.
Oct 07 22:20:21 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:20:21.807 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[5b549d74-4fb0-4511-a584-78cccba07b32]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:20:21 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:20:21.812 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[9266e0ce-d27c-4a0e-9b80-63a856c8b25f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:20:21 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:20:21.863 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[e975f5c1-149f-404e-8128-7ab0fe84a715]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:20:21 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:20:21.897 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[20548889-7a70-4f57-a0f7-acdf6ac2906f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapafe7be80-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2b:b4:6d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 5, 'rx_bytes': 1672, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 5, 'rx_bytes': 1672, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 65], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 532161, 'reachable_time': 21415, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227457, 'error': None, 'target': 'ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:20:21 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:20:21.915 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[89c3a9c3-205f-4e62-8eab-a82690a891fa]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapafe7be80-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 532178, 'tstamp': 532178}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227459, 'error': None, 'target': 'ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapafe7be80-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 532182, 'tstamp': 532182}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227459, 'error': None, 'target': 'ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:20:21 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:20:21.917 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapafe7be80-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:20:21 compute-0 nova_compute[192716]: 2025-10-07 22:20:21.919 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:21 compute-0 nova_compute[192716]: 2025-10-07 22:20:21.920 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:21 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:20:21.922 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapafe7be80-c0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:20:21 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:20:21.923 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:20:21 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:20:21.924 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapafe7be80-c0, col_values=(('external_ids', {'iface-id': 'b656ca07-6e70-4919-b525-077e26d9c217'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:20:21 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:20:21.926 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:20:21 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:20:21.929 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[7254fa84-5ed0-4f04-9272-5e7a1012a264]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-afe7be80-c16b-4cef-89c4-8851641c6faf\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/afe7be80-c16b-4cef-89c4-8851641c6faf.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID afe7be80-c16b-4cef-89c4-8851641c6faf\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:20:22 compute-0 nova_compute[192716]: 2025-10-07 22:20:22.445 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:20:25.660 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:20:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:20:25.661 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:20:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:20:25.662 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:20:25 compute-0 ovn_controller[94904]: 2025-10-07T22:20:25Z|00254|binding|INFO|Claiming lport f6995bd7-e02c-4608-8ad3-9be2589f6bc7 for this chassis.
Oct 07 22:20:25 compute-0 ovn_controller[94904]: 2025-10-07T22:20:25Z|00255|binding|INFO|f6995bd7-e02c-4608-8ad3-9be2589f6bc7: Claiming fa:16:3e:dd:23:81 10.100.0.7
Oct 07 22:20:25 compute-0 ovn_controller[94904]: 2025-10-07T22:20:25Z|00256|binding|INFO|Setting lport f6995bd7-e02c-4608-8ad3-9be2589f6bc7 up in Southbound
Oct 07 22:20:25 compute-0 nova_compute[192716]: 2025-10-07 22:20:25.911 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:26 compute-0 nova_compute[192716]: 2025-10-07 22:20:26.936 2 INFO nova.compute.manager [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c8726889-9168-439f-8940-764b45f2a9f6] Post operation of migration started
Oct 07 22:20:26 compute-0 nova_compute[192716]: 2025-10-07 22:20:26.937 2 WARNING neutronclient.v2_0.client [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:20:27 compute-0 nova_compute[192716]: 2025-10-07 22:20:27.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:27 compute-0 nova_compute[192716]: 2025-10-07 22:20:27.565 2 WARNING neutronclient.v2_0.client [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:20:27 compute-0 nova_compute[192716]: 2025-10-07 22:20:27.566 2 WARNING neutronclient.v2_0.client [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:20:27 compute-0 nova_compute[192716]: 2025-10-07 22:20:27.682 2 DEBUG oslo_concurrency.lockutils [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "refresh_cache-c8726889-9168-439f-8940-764b45f2a9f6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 07 22:20:27 compute-0 nova_compute[192716]: 2025-10-07 22:20:27.683 2 DEBUG oslo_concurrency.lockutils [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquired lock "refresh_cache-c8726889-9168-439f-8940-764b45f2a9f6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 07 22:20:27 compute-0 nova_compute[192716]: 2025-10-07 22:20:27.683 2 DEBUG nova.network.neutron [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c8726889-9168-439f-8940-764b45f2a9f6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 07 22:20:28 compute-0 nova_compute[192716]: 2025-10-07 22:20:28.190 2 WARNING neutronclient.v2_0.client [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:20:28 compute-0 podman[227480]: 2025-10-07 22:20:28.854698366 +0000 UTC m=+0.080866657 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, tcib_build_tag=watcher_latest, container_name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Oct 07 22:20:28 compute-0 podman[227479]: 2025-10-07 22:20:28.859793545 +0000 UTC m=+0.087811540 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=iscsid)
Oct 07 22:20:28 compute-0 nova_compute[192716]: 2025-10-07 22:20:28.970 2 WARNING neutronclient.v2_0.client [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:20:29 compute-0 nova_compute[192716]: 2025-10-07 22:20:29.299 2 DEBUG nova.network.neutron [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c8726889-9168-439f-8940-764b45f2a9f6] Updating instance_info_cache with network_info: [{"id": "f6995bd7-e02c-4608-8ad3-9be2589f6bc7", "address": "fa:16:3e:dd:23:81", "network": {"id": "afe7be80-c16b-4cef-89c4-8851641c6faf", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-895931661-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f18b68e1837842c293e8b6a621641354", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6995bd7-e0", "ovs_interfaceid": "f6995bd7-e02c-4608-8ad3-9be2589f6bc7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:20:29 compute-0 podman[203153]: time="2025-10-07T22:20:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:20:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:20:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20751 "" "Go-http-client/1.1"
Oct 07 22:20:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:20:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3495 "" "Go-http-client/1.1"
Oct 07 22:20:29 compute-0 nova_compute[192716]: 2025-10-07 22:20:29.807 2 DEBUG oslo_concurrency.lockutils [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Releasing lock "refresh_cache-c8726889-9168-439f-8940-764b45f2a9f6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 07 22:20:30 compute-0 nova_compute[192716]: 2025-10-07 22:20:30.325 2 DEBUG oslo_concurrency.lockutils [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:20:30 compute-0 nova_compute[192716]: 2025-10-07 22:20:30.325 2 DEBUG oslo_concurrency.lockutils [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:20:30 compute-0 nova_compute[192716]: 2025-10-07 22:20:30.326 2 DEBUG oslo_concurrency.lockutils [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:20:30 compute-0 nova_compute[192716]: 2025-10-07 22:20:30.333 2 INFO nova.virt.libvirt.driver [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c8726889-9168-439f-8940-764b45f2a9f6] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Oct 07 22:20:30 compute-0 virtqemud[192532]: Domain id=22 name='instance-0000001d' uuid=c8726889-9168-439f-8940-764b45f2a9f6 is tainted: custom-monitor
Oct 07 22:20:30 compute-0 nova_compute[192716]: 2025-10-07 22:20:30.913 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:31 compute-0 nova_compute[192716]: 2025-10-07 22:20:31.339 2 INFO nova.virt.libvirt.driver [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c8726889-9168-439f-8940-764b45f2a9f6] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Oct 07 22:20:31 compute-0 openstack_network_exporter[205305]: ERROR   22:20:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:20:31 compute-0 openstack_network_exporter[205305]: ERROR   22:20:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:20:31 compute-0 openstack_network_exporter[205305]: ERROR   22:20:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:20:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:20:31 compute-0 openstack_network_exporter[205305]: ERROR   22:20:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:20:31 compute-0 openstack_network_exporter[205305]: ERROR   22:20:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:20:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:20:32 compute-0 nova_compute[192716]: 2025-10-07 22:20:32.345 2 INFO nova.virt.libvirt.driver [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c8726889-9168-439f-8940-764b45f2a9f6] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Oct 07 22:20:32 compute-0 nova_compute[192716]: 2025-10-07 22:20:32.352 2 DEBUG nova.compute.manager [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c8726889-9168-439f-8940-764b45f2a9f6] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 07 22:20:32 compute-0 nova_compute[192716]: 2025-10-07 22:20:32.449 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:32 compute-0 podman[227519]: 2025-10-07 22:20:32.824365933 +0000 UTC m=+0.065756525 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 07 22:20:32 compute-0 nova_compute[192716]: 2025-10-07 22:20:32.864 2 DEBUG nova.objects.instance [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c8726889-9168-439f-8940-764b45f2a9f6] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Oct 07 22:20:33 compute-0 nova_compute[192716]: 2025-10-07 22:20:33.886 2 WARNING neutronclient.v2_0.client [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:20:34 compute-0 nova_compute[192716]: 2025-10-07 22:20:34.583 2 WARNING neutronclient.v2_0.client [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:20:34 compute-0 nova_compute[192716]: 2025-10-07 22:20:34.584 2 WARNING neutronclient.v2_0.client [None req-15c86892-f9d9-48b1-8ef7-5a7465dca62f 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:20:35 compute-0 nova_compute[192716]: 2025-10-07 22:20:35.916 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:37 compute-0 nova_compute[192716]: 2025-10-07 22:20:37.453 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:37 compute-0 nova_compute[192716]: 2025-10-07 22:20:37.629 2 DEBUG oslo_concurrency.lockutils [None req-debb1268-ee3b-49a8-a3c4-3665e599b6da a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Acquiring lock "c8726889-9168-439f-8940-764b45f2a9f6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:20:37 compute-0 nova_compute[192716]: 2025-10-07 22:20:37.629 2 DEBUG oslo_concurrency.lockutils [None req-debb1268-ee3b-49a8-a3c4-3665e599b6da a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Lock "c8726889-9168-439f-8940-764b45f2a9f6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:20:37 compute-0 nova_compute[192716]: 2025-10-07 22:20:37.630 2 DEBUG oslo_concurrency.lockutils [None req-debb1268-ee3b-49a8-a3c4-3665e599b6da a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Acquiring lock "c8726889-9168-439f-8940-764b45f2a9f6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:20:37 compute-0 nova_compute[192716]: 2025-10-07 22:20:37.630 2 DEBUG oslo_concurrency.lockutils [None req-debb1268-ee3b-49a8-a3c4-3665e599b6da a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Lock "c8726889-9168-439f-8940-764b45f2a9f6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:20:37 compute-0 nova_compute[192716]: 2025-10-07 22:20:37.631 2 DEBUG oslo_concurrency.lockutils [None req-debb1268-ee3b-49a8-a3c4-3665e599b6da a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Lock "c8726889-9168-439f-8940-764b45f2a9f6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:20:37 compute-0 nova_compute[192716]: 2025-10-07 22:20:37.644 2 INFO nova.compute.manager [None req-debb1268-ee3b-49a8-a3c4-3665e599b6da a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] [instance: c8726889-9168-439f-8940-764b45f2a9f6] Terminating instance
Oct 07 22:20:38 compute-0 nova_compute[192716]: 2025-10-07 22:20:38.161 2 DEBUG nova.compute.manager [None req-debb1268-ee3b-49a8-a3c4-3665e599b6da a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] [instance: c8726889-9168-439f-8940-764b45f2a9f6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 07 22:20:38 compute-0 kernel: tapf6995bd7-e0 (unregistering): left promiscuous mode
Oct 07 22:20:38 compute-0 NetworkManager[51722]: <info>  [1759875638.1819] device (tapf6995bd7-e0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 07 22:20:38 compute-0 ovn_controller[94904]: 2025-10-07T22:20:38Z|00257|binding|INFO|Releasing lport f6995bd7-e02c-4608-8ad3-9be2589f6bc7 from this chassis (sb_readonly=0)
Oct 07 22:20:38 compute-0 ovn_controller[94904]: 2025-10-07T22:20:38Z|00258|binding|INFO|Setting lport f6995bd7-e02c-4608-8ad3-9be2589f6bc7 down in Southbound
Oct 07 22:20:38 compute-0 nova_compute[192716]: 2025-10-07 22:20:38.195 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:38 compute-0 ovn_controller[94904]: 2025-10-07T22:20:38Z|00259|binding|INFO|Removing iface tapf6995bd7-e0 ovn-installed in OVS
Oct 07 22:20:38 compute-0 nova_compute[192716]: 2025-10-07 22:20:38.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:20:38.205 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dd:23:81 10.100.0.7'], port_security=['fa:16:3e:dd:23:81 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'c8726889-9168-439f-8940-764b45f2a9f6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-afe7be80-c16b-4cef-89c4-8851641c6faf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4cb01004a26f472187e01e5d3a57f84a', 'neutron:revision_number': '14', 'neutron:security_group_ids': '93dab7df-ccdf-44ad-a320-72fe683eb516', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=79a90f3c-820c-43b7-a388-8b7a51286af4, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], logical_port=f6995bd7-e02c-4608-8ad3-9be2589f6bc7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f071072e510>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:20:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:20:38.206 103791 INFO neutron.agent.ovn.metadata.agent [-] Port f6995bd7-e02c-4608-8ad3-9be2589f6bc7 in datapath afe7be80-c16b-4cef-89c4-8851641c6faf unbound from our chassis
Oct 07 22:20:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:20:38.207 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network afe7be80-c16b-4cef-89c4-8851641c6faf
Oct 07 22:20:38 compute-0 nova_compute[192716]: 2025-10-07 22:20:38.222 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:20:38.224 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[ed84e232-288e-4a09-978a-7b9a25f238b4]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:20:38 compute-0 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d0000001d.scope: Deactivated successfully.
Oct 07 22:20:38 compute-0 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d0000001d.scope: Consumed 2.118s CPU time.
Oct 07 22:20:38 compute-0 systemd-machined[152719]: Machine qemu-22-instance-0000001d terminated.
Oct 07 22:20:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:20:38.276 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[2799a9f4-e9ec-4a38-85f5-527aecbf66bd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:20:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:20:38.281 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[52f6ce8b-59d7-41d7-a27a-55204717ffe8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:20:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:20:38.323 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[81c348d4-5567-4be3-851f-bbcd48b90241]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:20:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:20:38.343 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[89a7faaa-150e-4735-aed7-e47d4763896f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapafe7be80-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2b:b4:6d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 48, 'tx_packets': 7, 'rx_bytes': 2512, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 48, 'tx_packets': 7, 'rx_bytes': 2512, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 65], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 532161, 'reachable_time': 21415, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227556, 'error': None, 'target': 'ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:20:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:20:38.367 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[d48a603f-d842-4e0a-8993-f67783c7eeee]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapafe7be80-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 532178, 'tstamp': 532178}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227557, 'error': None, 'target': 'ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapafe7be80-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 532182, 'tstamp': 532182}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227557, 'error': None, 'target': 'ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:20:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:20:38.368 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapafe7be80-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:20:38 compute-0 nova_compute[192716]: 2025-10-07 22:20:38.384 2 DEBUG nova.compute.manager [req-8c622b37-625b-4b2d-843d-9ce7e4c284d4 req-061581e3-ac78-43fa-8636-32fb8be79419 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c8726889-9168-439f-8940-764b45f2a9f6] Received event network-vif-unplugged-f6995bd7-e02c-4608-8ad3-9be2589f6bc7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:20:38 compute-0 nova_compute[192716]: 2025-10-07 22:20:38.385 2 DEBUG oslo_concurrency.lockutils [req-8c622b37-625b-4b2d-843d-9ce7e4c284d4 req-061581e3-ac78-43fa-8636-32fb8be79419 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "c8726889-9168-439f-8940-764b45f2a9f6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:20:38 compute-0 nova_compute[192716]: 2025-10-07 22:20:38.385 2 DEBUG oslo_concurrency.lockutils [req-8c622b37-625b-4b2d-843d-9ce7e4c284d4 req-061581e3-ac78-43fa-8636-32fb8be79419 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "c8726889-9168-439f-8940-764b45f2a9f6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:20:38 compute-0 nova_compute[192716]: 2025-10-07 22:20:38.385 2 DEBUG oslo_concurrency.lockutils [req-8c622b37-625b-4b2d-843d-9ce7e4c284d4 req-061581e3-ac78-43fa-8636-32fb8be79419 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "c8726889-9168-439f-8940-764b45f2a9f6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:20:38 compute-0 nova_compute[192716]: 2025-10-07 22:20:38.385 2 DEBUG nova.compute.manager [req-8c622b37-625b-4b2d-843d-9ce7e4c284d4 req-061581e3-ac78-43fa-8636-32fb8be79419 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c8726889-9168-439f-8940-764b45f2a9f6] No waiting events found dispatching network-vif-unplugged-f6995bd7-e02c-4608-8ad3-9be2589f6bc7 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 22:20:38 compute-0 nova_compute[192716]: 2025-10-07 22:20:38.385 2 DEBUG nova.compute.manager [req-8c622b37-625b-4b2d-843d-9ce7e4c284d4 req-061581e3-ac78-43fa-8636-32fb8be79419 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c8726889-9168-439f-8940-764b45f2a9f6] Received event network-vif-unplugged-f6995bd7-e02c-4608-8ad3-9be2589f6bc7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 07 22:20:38 compute-0 NetworkManager[51722]: <info>  [1759875638.4149] manager: (tapf6995bd7-e0): new Tun device (/org/freedesktop/NetworkManager/Devices/93)
Oct 07 22:20:38 compute-0 nova_compute[192716]: 2025-10-07 22:20:38.414 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:38 compute-0 nova_compute[192716]: 2025-10-07 22:20:38.416 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:38 compute-0 nova_compute[192716]: 2025-10-07 22:20:38.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:20:38.422 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapafe7be80-c0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:20:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:20:38.422 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:20:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:20:38.422 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapafe7be80-c0, col_values=(('external_ids', {'iface-id': 'b656ca07-6e70-4919-b525-077e26d9c217'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:20:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:20:38.423 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:20:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:20:38.424 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[94fc973f-a2f9-454f-9e71-6a80600779b6]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-afe7be80-c16b-4cef-89c4-8851641c6faf\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/afe7be80-c16b-4cef-89c4-8851641c6faf.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID afe7be80-c16b-4cef-89c4-8851641c6faf\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:20:38 compute-0 nova_compute[192716]: 2025-10-07 22:20:38.469 2 INFO nova.virt.libvirt.driver [-] [instance: c8726889-9168-439f-8940-764b45f2a9f6] Instance destroyed successfully.
Oct 07 22:20:38 compute-0 nova_compute[192716]: 2025-10-07 22:20:38.470 2 DEBUG nova.objects.instance [None req-debb1268-ee3b-49a8-a3c4-3665e599b6da a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Lazy-loading 'resources' on Instance uuid c8726889-9168-439f-8940-764b45f2a9f6 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 22:20:38 compute-0 nova_compute[192716]: 2025-10-07 22:20:38.976 2 DEBUG nova.virt.libvirt.vif [None req-debb1268-ee3b-49a8-a3c4-3665e599b6da a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-10-07T22:18:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-540768949',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-5407689',id=29,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-07T22:19:00Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4cb01004a26f472187e01e5d3a57f84a',ramdisk_id='',reservation_id='r-9apanns3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',clean_attempts='1',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-866189760',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-866189760-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T22:20:33Z,user_data=None,user_id='a0c373c3cf7242d4af22e259b5a27a6b',uuid=c8726889-9168-439f-8940-764b45f2a9f6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f6995bd7-e02c-4608-8ad3-9be2589f6bc7", "address": "fa:16:3e:dd:23:81", "network": {"id": "afe7be80-c16b-4cef-89c4-8851641c6faf", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-895931661-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f18b68e1837842c293e8b6a621641354", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6995bd7-e0", "ovs_interfaceid": "f6995bd7-e02c-4608-8ad3-9be2589f6bc7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 07 22:20:38 compute-0 nova_compute[192716]: 2025-10-07 22:20:38.977 2 DEBUG nova.network.os_vif_util [None req-debb1268-ee3b-49a8-a3c4-3665e599b6da a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Converting VIF {"id": "f6995bd7-e02c-4608-8ad3-9be2589f6bc7", "address": "fa:16:3e:dd:23:81", "network": {"id": "afe7be80-c16b-4cef-89c4-8851641c6faf", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-895931661-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f18b68e1837842c293e8b6a621641354", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6995bd7-e0", "ovs_interfaceid": "f6995bd7-e02c-4608-8ad3-9be2589f6bc7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 22:20:38 compute-0 nova_compute[192716]: 2025-10-07 22:20:38.978 2 DEBUG nova.network.os_vif_util [None req-debb1268-ee3b-49a8-a3c4-3665e599b6da a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:dd:23:81,bridge_name='br-int',has_traffic_filtering=True,id=f6995bd7-e02c-4608-8ad3-9be2589f6bc7,network=Network(afe7be80-c16b-4cef-89c4-8851641c6faf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6995bd7-e0') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 22:20:38 compute-0 nova_compute[192716]: 2025-10-07 22:20:38.978 2 DEBUG os_vif [None req-debb1268-ee3b-49a8-a3c4-3665e599b6da a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:dd:23:81,bridge_name='br-int',has_traffic_filtering=True,id=f6995bd7-e02c-4608-8ad3-9be2589f6bc7,network=Network(afe7be80-c16b-4cef-89c4-8851641c6faf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6995bd7-e0') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 07 22:20:38 compute-0 nova_compute[192716]: 2025-10-07 22:20:38.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:38 compute-0 nova_compute[192716]: 2025-10-07 22:20:38.980 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf6995bd7-e0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:20:38 compute-0 nova_compute[192716]: 2025-10-07 22:20:38.981 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:38 compute-0 nova_compute[192716]: 2025-10-07 22:20:38.983 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 07 22:20:38 compute-0 nova_compute[192716]: 2025-10-07 22:20:38.984 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:38 compute-0 nova_compute[192716]: 2025-10-07 22:20:38.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:38 compute-0 nova_compute[192716]: 2025-10-07 22:20:38.985 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=74d78e0c-ec21-4859-9af3-84e4d1c680cc) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:20:38 compute-0 nova_compute[192716]: 2025-10-07 22:20:38.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:38 compute-0 nova_compute[192716]: 2025-10-07 22:20:38.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 07 22:20:38 compute-0 nova_compute[192716]: 2025-10-07 22:20:38.989 2 INFO os_vif [None req-debb1268-ee3b-49a8-a3c4-3665e599b6da a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:dd:23:81,bridge_name='br-int',has_traffic_filtering=True,id=f6995bd7-e02c-4608-8ad3-9be2589f6bc7,network=Network(afe7be80-c16b-4cef-89c4-8851641c6faf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6995bd7-e0')
Oct 07 22:20:38 compute-0 nova_compute[192716]: 2025-10-07 22:20:38.990 2 INFO nova.virt.libvirt.driver [None req-debb1268-ee3b-49a8-a3c4-3665e599b6da a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] [instance: c8726889-9168-439f-8940-764b45f2a9f6] Deleting instance files /var/lib/nova/instances/c8726889-9168-439f-8940-764b45f2a9f6_del
Oct 07 22:20:38 compute-0 nova_compute[192716]: 2025-10-07 22:20:38.991 2 INFO nova.virt.libvirt.driver [None req-debb1268-ee3b-49a8-a3c4-3665e599b6da a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] [instance: c8726889-9168-439f-8940-764b45f2a9f6] Deletion of /var/lib/nova/instances/c8726889-9168-439f-8940-764b45f2a9f6_del complete
Oct 07 22:20:39 compute-0 nova_compute[192716]: 2025-10-07 22:20:39.502 2 INFO nova.compute.manager [None req-debb1268-ee3b-49a8-a3c4-3665e599b6da a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] [instance: c8726889-9168-439f-8940-764b45f2a9f6] Took 1.34 seconds to destroy the instance on the hypervisor.
Oct 07 22:20:39 compute-0 nova_compute[192716]: 2025-10-07 22:20:39.503 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-debb1268-ee3b-49a8-a3c4-3665e599b6da a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 07 22:20:39 compute-0 nova_compute[192716]: 2025-10-07 22:20:39.504 2 DEBUG nova.compute.manager [-] [instance: c8726889-9168-439f-8940-764b45f2a9f6] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 07 22:20:39 compute-0 nova_compute[192716]: 2025-10-07 22:20:39.504 2 DEBUG nova.network.neutron [-] [instance: c8726889-9168-439f-8940-764b45f2a9f6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 07 22:20:39 compute-0 nova_compute[192716]: 2025-10-07 22:20:39.504 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:20:39 compute-0 nova_compute[192716]: 2025-10-07 22:20:39.886 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:20:40 compute-0 nova_compute[192716]: 2025-10-07 22:20:40.264 2 DEBUG nova.compute.manager [req-bab7df0b-ea6d-48c3-9c9f-651b70533095 req-b3a2da6b-b829-439c-8b71-71c1b1c6aebc 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c8726889-9168-439f-8940-764b45f2a9f6] Received event network-vif-deleted-f6995bd7-e02c-4608-8ad3-9be2589f6bc7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:20:40 compute-0 nova_compute[192716]: 2025-10-07 22:20:40.265 2 INFO nova.compute.manager [req-bab7df0b-ea6d-48c3-9c9f-651b70533095 req-b3a2da6b-b829-439c-8b71-71c1b1c6aebc 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c8726889-9168-439f-8940-764b45f2a9f6] Neutron deleted interface f6995bd7-e02c-4608-8ad3-9be2589f6bc7; detaching it from the instance and deleting it from the info cache
Oct 07 22:20:40 compute-0 nova_compute[192716]: 2025-10-07 22:20:40.265 2 DEBUG nova.network.neutron [req-bab7df0b-ea6d-48c3-9c9f-651b70533095 req-b3a2da6b-b829-439c-8b71-71c1b1c6aebc 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c8726889-9168-439f-8940-764b45f2a9f6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:20:40 compute-0 nova_compute[192716]: 2025-10-07 22:20:40.442 2 DEBUG nova.compute.manager [req-74ffdfd3-3ef6-40fc-9b64-f37d76fa929a req-7278e2e1-6693-49c1-a49e-c64b4f95a2b0 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c8726889-9168-439f-8940-764b45f2a9f6] Received event network-vif-unplugged-f6995bd7-e02c-4608-8ad3-9be2589f6bc7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:20:40 compute-0 nova_compute[192716]: 2025-10-07 22:20:40.442 2 DEBUG oslo_concurrency.lockutils [req-74ffdfd3-3ef6-40fc-9b64-f37d76fa929a req-7278e2e1-6693-49c1-a49e-c64b4f95a2b0 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "c8726889-9168-439f-8940-764b45f2a9f6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:20:40 compute-0 nova_compute[192716]: 2025-10-07 22:20:40.442 2 DEBUG oslo_concurrency.lockutils [req-74ffdfd3-3ef6-40fc-9b64-f37d76fa929a req-7278e2e1-6693-49c1-a49e-c64b4f95a2b0 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "c8726889-9168-439f-8940-764b45f2a9f6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:20:40 compute-0 nova_compute[192716]: 2025-10-07 22:20:40.443 2 DEBUG oslo_concurrency.lockutils [req-74ffdfd3-3ef6-40fc-9b64-f37d76fa929a req-7278e2e1-6693-49c1-a49e-c64b4f95a2b0 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "c8726889-9168-439f-8940-764b45f2a9f6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:20:40 compute-0 nova_compute[192716]: 2025-10-07 22:20:40.443 2 DEBUG nova.compute.manager [req-74ffdfd3-3ef6-40fc-9b64-f37d76fa929a req-7278e2e1-6693-49c1-a49e-c64b4f95a2b0 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c8726889-9168-439f-8940-764b45f2a9f6] No waiting events found dispatching network-vif-unplugged-f6995bd7-e02c-4608-8ad3-9be2589f6bc7 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 22:20:40 compute-0 nova_compute[192716]: 2025-10-07 22:20:40.443 2 DEBUG nova.compute.manager [req-74ffdfd3-3ef6-40fc-9b64-f37d76fa929a req-7278e2e1-6693-49c1-a49e-c64b4f95a2b0 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c8726889-9168-439f-8940-764b45f2a9f6] Received event network-vif-unplugged-f6995bd7-e02c-4608-8ad3-9be2589f6bc7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 07 22:20:40 compute-0 nova_compute[192716]: 2025-10-07 22:20:40.705 2 DEBUG nova.network.neutron [-] [instance: c8726889-9168-439f-8940-764b45f2a9f6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:20:40 compute-0 nova_compute[192716]: 2025-10-07 22:20:40.774 2 DEBUG nova.compute.manager [req-bab7df0b-ea6d-48c3-9c9f-651b70533095 req-b3a2da6b-b829-439c-8b71-71c1b1c6aebc 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c8726889-9168-439f-8940-764b45f2a9f6] Detach interface failed, port_id=f6995bd7-e02c-4608-8ad3-9be2589f6bc7, reason: Instance c8726889-9168-439f-8940-764b45f2a9f6 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 07 22:20:40 compute-0 podman[227574]: 2025-10-07 22:20:40.842037897 +0000 UTC m=+0.068307570 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251007)
Oct 07 22:20:40 compute-0 podman[227573]: 2025-10-07 22:20:40.882180831 +0000 UTC m=+0.118746405 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct 07 22:20:40 compute-0 nova_compute[192716]: 2025-10-07 22:20:40.918 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:41 compute-0 nova_compute[192716]: 2025-10-07 22:20:41.214 2 INFO nova.compute.manager [-] [instance: c8726889-9168-439f-8940-764b45f2a9f6] Took 1.71 seconds to deallocate network for instance.
Oct 07 22:20:41 compute-0 nova_compute[192716]: 2025-10-07 22:20:41.742 2 DEBUG oslo_concurrency.lockutils [None req-debb1268-ee3b-49a8-a3c4-3665e599b6da a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:20:41 compute-0 nova_compute[192716]: 2025-10-07 22:20:41.743 2 DEBUG oslo_concurrency.lockutils [None req-debb1268-ee3b-49a8-a3c4-3665e599b6da a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:20:41 compute-0 nova_compute[192716]: 2025-10-07 22:20:41.749 2 DEBUG oslo_concurrency.lockutils [None req-debb1268-ee3b-49a8-a3c4-3665e599b6da a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.006s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:20:41 compute-0 nova_compute[192716]: 2025-10-07 22:20:41.787 2 INFO nova.scheduler.client.report [None req-debb1268-ee3b-49a8-a3c4-3665e599b6da a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Deleted allocations for instance c8726889-9168-439f-8940-764b45f2a9f6
Oct 07 22:20:41 compute-0 nova_compute[192716]: 2025-10-07 22:20:41.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:20:41 compute-0 nova_compute[192716]: 2025-10-07 22:20:41.992 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 07 22:20:42 compute-0 nova_compute[192716]: 2025-10-07 22:20:42.816 2 DEBUG oslo_concurrency.lockutils [None req-debb1268-ee3b-49a8-a3c4-3665e599b6da a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Lock "c8726889-9168-439f-8940-764b45f2a9f6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.187s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:20:43 compute-0 nova_compute[192716]: 2025-10-07 22:20:43.314 2 DEBUG oslo_concurrency.lockutils [None req-91c852a3-5c3a-48e3-adc7-4e6f063d7fed a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Acquiring lock "c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:20:43 compute-0 nova_compute[192716]: 2025-10-07 22:20:43.315 2 DEBUG oslo_concurrency.lockutils [None req-91c852a3-5c3a-48e3-adc7-4e6f063d7fed a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Lock "c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:20:43 compute-0 nova_compute[192716]: 2025-10-07 22:20:43.315 2 DEBUG oslo_concurrency.lockutils [None req-91c852a3-5c3a-48e3-adc7-4e6f063d7fed a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Acquiring lock "c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:20:43 compute-0 nova_compute[192716]: 2025-10-07 22:20:43.315 2 DEBUG oslo_concurrency.lockutils [None req-91c852a3-5c3a-48e3-adc7-4e6f063d7fed a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Lock "c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:20:43 compute-0 nova_compute[192716]: 2025-10-07 22:20:43.316 2 DEBUG oslo_concurrency.lockutils [None req-91c852a3-5c3a-48e3-adc7-4e6f063d7fed a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Lock "c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:20:43 compute-0 nova_compute[192716]: 2025-10-07 22:20:43.330 2 INFO nova.compute.manager [None req-91c852a3-5c3a-48e3-adc7-4e6f063d7fed a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] [instance: c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec] Terminating instance
Oct 07 22:20:43 compute-0 nova_compute[192716]: 2025-10-07 22:20:43.850 2 DEBUG nova.compute.manager [None req-91c852a3-5c3a-48e3-adc7-4e6f063d7fed a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] [instance: c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 07 22:20:43 compute-0 kernel: tapda0c4ab2-40 (unregistering): left promiscuous mode
Oct 07 22:20:43 compute-0 NetworkManager[51722]: <info>  [1759875643.8774] device (tapda0c4ab2-40): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 07 22:20:43 compute-0 nova_compute[192716]: 2025-10-07 22:20:43.885 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:43 compute-0 ovn_controller[94904]: 2025-10-07T22:20:43Z|00260|binding|INFO|Releasing lport da0c4ab2-4043-426e-a322-76ffcf2e1751 from this chassis (sb_readonly=0)
Oct 07 22:20:43 compute-0 ovn_controller[94904]: 2025-10-07T22:20:43Z|00261|binding|INFO|Setting lport da0c4ab2-4043-426e-a322-76ffcf2e1751 down in Southbound
Oct 07 22:20:43 compute-0 ovn_controller[94904]: 2025-10-07T22:20:43Z|00262|binding|INFO|Removing iface tapda0c4ab2-40 ovn-installed in OVS
Oct 07 22:20:43 compute-0 nova_compute[192716]: 2025-10-07 22:20:43.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:43 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:20:43.895 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:46:9e:e5 10.100.0.10'], port_security=['fa:16:3e:46:9e:e5 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-afe7be80-c16b-4cef-89c4-8851641c6faf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4cb01004a26f472187e01e5d3a57f84a', 'neutron:revision_number': '15', 'neutron:security_group_ids': '93dab7df-ccdf-44ad-a320-72fe683eb516', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=79a90f3c-820c-43b7-a388-8b7a51286af4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], logical_port=da0c4ab2-4043-426e-a322-76ffcf2e1751) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f071072e510>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:20:43 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:20:43.896 103791 INFO neutron.agent.ovn.metadata.agent [-] Port da0c4ab2-4043-426e-a322-76ffcf2e1751 in datapath afe7be80-c16b-4cef-89c4-8851641c6faf unbound from our chassis
Oct 07 22:20:43 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:20:43.897 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network afe7be80-c16b-4cef-89c4-8851641c6faf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 07 22:20:43 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:20:43.898 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[3f3c98b1-a7b1-4c5c-9710-06fe4d4a6e62]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:20:43 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:20:43.899 103791 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf namespace which is not needed anymore
Oct 07 22:20:43 compute-0 nova_compute[192716]: 2025-10-07 22:20:43.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:43 compute-0 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000001c.scope: Deactivated successfully.
Oct 07 22:20:43 compute-0 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000001c.scope: Consumed 3.797s CPU time.
Oct 07 22:20:43 compute-0 systemd-machined[152719]: Machine qemu-21-instance-0000001c terminated.
Oct 07 22:20:43 compute-0 nova_compute[192716]: 2025-10-07 22:20:43.987 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:44 compute-0 neutron-haproxy-ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf[227242]: [NOTICE]   (227246) : haproxy version is 3.0.5-8e879a5
Oct 07 22:20:44 compute-0 podman[227641]: 2025-10-07 22:20:44.016613331 +0000 UTC m=+0.030915775 container kill 61aee5663c3bc0d84b94bf068dd6d11e4f035c29e97cb67760b424fe87aed873 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Oct 07 22:20:44 compute-0 neutron-haproxy-ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf[227242]: [NOTICE]   (227246) : path to executable is /usr/sbin/haproxy
Oct 07 22:20:44 compute-0 neutron-haproxy-ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf[227242]: [WARNING]  (227246) : Exiting Master process...
Oct 07 22:20:44 compute-0 neutron-haproxy-ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf[227242]: [ALERT]    (227246) : Current worker (227248) exited with code 143 (Terminated)
Oct 07 22:20:44 compute-0 neutron-haproxy-ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf[227242]: [WARNING]  (227246) : All workers exited. Exiting... (0)
Oct 07 22:20:44 compute-0 systemd[1]: libpod-61aee5663c3bc0d84b94bf068dd6d11e4f035c29e97cb67760b424fe87aed873.scope: Deactivated successfully.
Oct 07 22:20:44 compute-0 nova_compute[192716]: 2025-10-07 22:20:44.054 2 DEBUG nova.compute.manager [req-4badf45c-79cf-4aad-9ff2-9bbc2e1ba1de req-5f0d0609-fa44-49e1-b31f-61aad7ff6f29 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec] Received event network-vif-unplugged-da0c4ab2-4043-426e-a322-76ffcf2e1751 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:20:44 compute-0 nova_compute[192716]: 2025-10-07 22:20:44.055 2 DEBUG oslo_concurrency.lockutils [req-4badf45c-79cf-4aad-9ff2-9bbc2e1ba1de req-5f0d0609-fa44-49e1-b31f-61aad7ff6f29 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:20:44 compute-0 nova_compute[192716]: 2025-10-07 22:20:44.055 2 DEBUG oslo_concurrency.lockutils [req-4badf45c-79cf-4aad-9ff2-9bbc2e1ba1de req-5f0d0609-fa44-49e1-b31f-61aad7ff6f29 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:20:44 compute-0 nova_compute[192716]: 2025-10-07 22:20:44.055 2 DEBUG oslo_concurrency.lockutils [req-4badf45c-79cf-4aad-9ff2-9bbc2e1ba1de req-5f0d0609-fa44-49e1-b31f-61aad7ff6f29 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:20:44 compute-0 nova_compute[192716]: 2025-10-07 22:20:44.056 2 DEBUG nova.compute.manager [req-4badf45c-79cf-4aad-9ff2-9bbc2e1ba1de req-5f0d0609-fa44-49e1-b31f-61aad7ff6f29 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec] No waiting events found dispatching network-vif-unplugged-da0c4ab2-4043-426e-a322-76ffcf2e1751 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 22:20:44 compute-0 nova_compute[192716]: 2025-10-07 22:20:44.056 2 DEBUG nova.compute.manager [req-4badf45c-79cf-4aad-9ff2-9bbc2e1ba1de req-5f0d0609-fa44-49e1-b31f-61aad7ff6f29 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec] Received event network-vif-unplugged-da0c4ab2-4043-426e-a322-76ffcf2e1751 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 07 22:20:44 compute-0 podman[227657]: 2025-10-07 22:20:44.066108179 +0000 UTC m=+0.027158035 container died 61aee5663c3bc0d84b94bf068dd6d11e4f035c29e97cb67760b424fe87aed873 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 07 22:20:44 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-61aee5663c3bc0d84b94bf068dd6d11e4f035c29e97cb67760b424fe87aed873-userdata-shm.mount: Deactivated successfully.
Oct 07 22:20:44 compute-0 systemd[1]: var-lib-containers-storage-overlay-c5ada2d84d1e147d07a035c0874a6255b21503c6f560a20499898ae4d1616faa-merged.mount: Deactivated successfully.
Oct 07 22:20:44 compute-0 nova_compute[192716]: 2025-10-07 22:20:44.121 2 INFO nova.virt.libvirt.driver [-] [instance: c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec] Instance destroyed successfully.
Oct 07 22:20:44 compute-0 podman[227657]: 2025-10-07 22:20:44.122779267 +0000 UTC m=+0.083829103 container cleanup 61aee5663c3bc0d84b94bf068dd6d11e4f035c29e97cb67760b424fe87aed873 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Oct 07 22:20:44 compute-0 nova_compute[192716]: 2025-10-07 22:20:44.123 2 DEBUG nova.objects.instance [None req-91c852a3-5c3a-48e3-adc7-4e6f063d7fed a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Lazy-loading 'resources' on Instance uuid c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 22:20:44 compute-0 systemd[1]: libpod-conmon-61aee5663c3bc0d84b94bf068dd6d11e4f035c29e97cb67760b424fe87aed873.scope: Deactivated successfully.
Oct 07 22:20:44 compute-0 podman[227658]: 2025-10-07 22:20:44.139652461 +0000 UTC m=+0.088563842 container remove 61aee5663c3bc0d84b94bf068dd6d11e4f035c29e97cb67760b424fe87aed873 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf, org.label-schema.build-date=20251007, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Oct 07 22:20:44 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:20:44.145 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[68c429ff-f1d5-4066-b159-092bbff9e0aa]: (4, ("Tue Oct  7 10:20:43 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf (61aee5663c3bc0d84b94bf068dd6d11e4f035c29e97cb67760b424fe87aed873)\n61aee5663c3bc0d84b94bf068dd6d11e4f035c29e97cb67760b424fe87aed873\nTue Oct  7 10:20:44 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf (61aee5663c3bc0d84b94bf068dd6d11e4f035c29e97cb67760b424fe87aed873)\n61aee5663c3bc0d84b94bf068dd6d11e4f035c29e97cb67760b424fe87aed873\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:20:44 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:20:44.146 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[21b08545-adb0-4e82-a05b-937a6e753db0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:20:44 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:20:44.146 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/afe7be80-c16b-4cef-89c4-8851641c6faf.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/afe7be80-c16b-4cef-89c4-8851641c6faf.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 22:20:44 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:20:44.147 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[65149c73-76fc-4d95-9593-45db59508718]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:20:44 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:20:44.147 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapafe7be80-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:20:44 compute-0 nova_compute[192716]: 2025-10-07 22:20:44.152 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:44 compute-0 kernel: tapafe7be80-c0: left promiscuous mode
Oct 07 22:20:44 compute-0 nova_compute[192716]: 2025-10-07 22:20:44.169 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:44 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:20:44.172 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[9c71e8fa-d12d-43b4-99a3-e5a2eefc00e5]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:20:44 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:20:44.203 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[e960e443-f696-4f74-a200-61614521283e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:20:44 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:20:44.205 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[870afcb2-478d-4a9e-87b7-29665bbef069]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:20:44 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:20:44.224 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[335778e0-02e7-42d7-9bd8-3537a9a1c1c8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 532151, 'reachable_time': 18714, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227703, 'error': None, 'target': 'ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:20:44 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:20:44.227 103905 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-afe7be80-c16b-4cef-89c4-8851641c6faf deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Oct 07 22:20:44 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:20:44.228 103905 DEBUG oslo.privsep.daemon [-] privsep: reply[525cf589-b764-476b-9dec-7c4f33529284]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:20:44 compute-0 systemd[1]: run-netns-ovnmeta\x2dafe7be80\x2dc16b\x2d4cef\x2d89c4\x2d8851641c6faf.mount: Deactivated successfully.
Oct 07 22:20:44 compute-0 nova_compute[192716]: 2025-10-07 22:20:44.628 2 DEBUG nova.virt.libvirt.vif [None req-91c852a3-5c3a-48e3-adc7-4e6f063d7fed a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-10-07T22:18:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-265958160',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-2659581',id=28,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-07T22:18:38Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4cb01004a26f472187e01e5d3a57f84a',ramdisk_id='',reservation_id='r-z0jsmwo0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',clean_attempts='1',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-866189760',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-866189760-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T22:20:00Z,user_data=None,user_id='a0c373c3cf7242d4af22e259b5a27a6b',uuid=c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "da0c4ab2-4043-426e-a322-76ffcf2e1751", "address": "fa:16:3e:46:9e:e5", "network": {"id": "afe7be80-c16b-4cef-89c4-8851641c6faf", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-895931661-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f18b68e1837842c293e8b6a621641354", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda0c4ab2-40", "ovs_interfaceid": "da0c4ab2-4043-426e-a322-76ffcf2e1751", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 07 22:20:44 compute-0 nova_compute[192716]: 2025-10-07 22:20:44.629 2 DEBUG nova.network.os_vif_util [None req-91c852a3-5c3a-48e3-adc7-4e6f063d7fed a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Converting VIF {"id": "da0c4ab2-4043-426e-a322-76ffcf2e1751", "address": "fa:16:3e:46:9e:e5", "network": {"id": "afe7be80-c16b-4cef-89c4-8851641c6faf", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-895931661-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f18b68e1837842c293e8b6a621641354", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda0c4ab2-40", "ovs_interfaceid": "da0c4ab2-4043-426e-a322-76ffcf2e1751", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 22:20:44 compute-0 nova_compute[192716]: 2025-10-07 22:20:44.629 2 DEBUG nova.network.os_vif_util [None req-91c852a3-5c3a-48e3-adc7-4e6f063d7fed a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:46:9e:e5,bridge_name='br-int',has_traffic_filtering=True,id=da0c4ab2-4043-426e-a322-76ffcf2e1751,network=Network(afe7be80-c16b-4cef-89c4-8851641c6faf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda0c4ab2-40') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 22:20:44 compute-0 nova_compute[192716]: 2025-10-07 22:20:44.630 2 DEBUG os_vif [None req-91c852a3-5c3a-48e3-adc7-4e6f063d7fed a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:46:9e:e5,bridge_name='br-int',has_traffic_filtering=True,id=da0c4ab2-4043-426e-a322-76ffcf2e1751,network=Network(afe7be80-c16b-4cef-89c4-8851641c6faf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda0c4ab2-40') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 07 22:20:44 compute-0 nova_compute[192716]: 2025-10-07 22:20:44.631 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:44 compute-0 nova_compute[192716]: 2025-10-07 22:20:44.631 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapda0c4ab2-40, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:20:44 compute-0 nova_compute[192716]: 2025-10-07 22:20:44.633 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:44 compute-0 nova_compute[192716]: 2025-10-07 22:20:44.634 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 07 22:20:44 compute-0 nova_compute[192716]: 2025-10-07 22:20:44.635 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:44 compute-0 nova_compute[192716]: 2025-10-07 22:20:44.635 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:44 compute-0 nova_compute[192716]: 2025-10-07 22:20:44.636 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=92d8c085-ca32-4a97-899c-e6bcd925f2d3) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:20:44 compute-0 nova_compute[192716]: 2025-10-07 22:20:44.636 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:44 compute-0 nova_compute[192716]: 2025-10-07 22:20:44.637 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:44 compute-0 nova_compute[192716]: 2025-10-07 22:20:44.639 2 INFO os_vif [None req-91c852a3-5c3a-48e3-adc7-4e6f063d7fed a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:46:9e:e5,bridge_name='br-int',has_traffic_filtering=True,id=da0c4ab2-4043-426e-a322-76ffcf2e1751,network=Network(afe7be80-c16b-4cef-89c4-8851641c6faf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda0c4ab2-40')
Oct 07 22:20:44 compute-0 nova_compute[192716]: 2025-10-07 22:20:44.640 2 INFO nova.virt.libvirt.driver [None req-91c852a3-5c3a-48e3-adc7-4e6f063d7fed a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] [instance: c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec] Deleting instance files /var/lib/nova/instances/c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec_del
Oct 07 22:20:44 compute-0 nova_compute[192716]: 2025-10-07 22:20:44.640 2 INFO nova.virt.libvirt.driver [None req-91c852a3-5c3a-48e3-adc7-4e6f063d7fed a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] [instance: c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec] Deletion of /var/lib/nova/instances/c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec_del complete
Oct 07 22:20:44 compute-0 nova_compute[192716]: 2025-10-07 22:20:44.669 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:44 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:20:44.670 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=31, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:e1:cc', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '62:84:ba:a4:1b:31'}, ipsec=False) old=SB_Global(nb_cfg=30) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:20:44 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:20:44.671 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 07 22:20:44 compute-0 nova_compute[192716]: 2025-10-07 22:20:44.992 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:20:45 compute-0 nova_compute[192716]: 2025-10-07 22:20:45.153 2 INFO nova.compute.manager [None req-91c852a3-5c3a-48e3-adc7-4e6f063d7fed a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] [instance: c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec] Took 1.30 seconds to destroy the instance on the hypervisor.
Oct 07 22:20:45 compute-0 nova_compute[192716]: 2025-10-07 22:20:45.153 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-91c852a3-5c3a-48e3-adc7-4e6f063d7fed a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 07 22:20:45 compute-0 nova_compute[192716]: 2025-10-07 22:20:45.154 2 DEBUG nova.compute.manager [-] [instance: c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 07 22:20:45 compute-0 nova_compute[192716]: 2025-10-07 22:20:45.154 2 DEBUG nova.network.neutron [-] [instance: c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 07 22:20:45 compute-0 nova_compute[192716]: 2025-10-07 22:20:45.154 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:20:45 compute-0 nova_compute[192716]: 2025-10-07 22:20:45.568 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:20:45 compute-0 nova_compute[192716]: 2025-10-07 22:20:45.921 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:46 compute-0 nova_compute[192716]: 2025-10-07 22:20:46.132 2 DEBUG nova.compute.manager [req-9030fb06-e6b1-4321-949a-4ad4ccfa747a req-6b5a16d9-51cc-418f-8524-8c413df4ae48 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec] Received event network-vif-unplugged-da0c4ab2-4043-426e-a322-76ffcf2e1751 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:20:46 compute-0 nova_compute[192716]: 2025-10-07 22:20:46.132 2 DEBUG oslo_concurrency.lockutils [req-9030fb06-e6b1-4321-949a-4ad4ccfa747a req-6b5a16d9-51cc-418f-8524-8c413df4ae48 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:20:46 compute-0 nova_compute[192716]: 2025-10-07 22:20:46.133 2 DEBUG oslo_concurrency.lockutils [req-9030fb06-e6b1-4321-949a-4ad4ccfa747a req-6b5a16d9-51cc-418f-8524-8c413df4ae48 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:20:46 compute-0 nova_compute[192716]: 2025-10-07 22:20:46.133 2 DEBUG oslo_concurrency.lockutils [req-9030fb06-e6b1-4321-949a-4ad4ccfa747a req-6b5a16d9-51cc-418f-8524-8c413df4ae48 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:20:46 compute-0 nova_compute[192716]: 2025-10-07 22:20:46.133 2 DEBUG nova.compute.manager [req-9030fb06-e6b1-4321-949a-4ad4ccfa747a req-6b5a16d9-51cc-418f-8524-8c413df4ae48 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec] No waiting events found dispatching network-vif-unplugged-da0c4ab2-4043-426e-a322-76ffcf2e1751 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 22:20:46 compute-0 nova_compute[192716]: 2025-10-07 22:20:46.134 2 DEBUG nova.compute.manager [req-9030fb06-e6b1-4321-949a-4ad4ccfa747a req-6b5a16d9-51cc-418f-8524-8c413df4ae48 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec] Received event network-vif-unplugged-da0c4ab2-4043-426e-a322-76ffcf2e1751 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 07 22:20:46 compute-0 nova_compute[192716]: 2025-10-07 22:20:46.134 2 DEBUG nova.compute.manager [req-9030fb06-e6b1-4321-949a-4ad4ccfa747a req-6b5a16d9-51cc-418f-8524-8c413df4ae48 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec] Received event network-vif-deleted-da0c4ab2-4043-426e-a322-76ffcf2e1751 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:20:46 compute-0 nova_compute[192716]: 2025-10-07 22:20:46.134 2 INFO nova.compute.manager [req-9030fb06-e6b1-4321-949a-4ad4ccfa747a req-6b5a16d9-51cc-418f-8524-8c413df4ae48 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec] Neutron deleted interface da0c4ab2-4043-426e-a322-76ffcf2e1751; detaching it from the instance and deleting it from the info cache
Oct 07 22:20:46 compute-0 nova_compute[192716]: 2025-10-07 22:20:46.134 2 DEBUG nova.network.neutron [req-9030fb06-e6b1-4321-949a-4ad4ccfa747a req-6b5a16d9-51cc-418f-8524-8c413df4ae48 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:20:46 compute-0 nova_compute[192716]: 2025-10-07 22:20:46.359 2 DEBUG nova.network.neutron [-] [instance: c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:20:46 compute-0 nova_compute[192716]: 2025-10-07 22:20:46.644 2 DEBUG nova.compute.manager [req-9030fb06-e6b1-4321-949a-4ad4ccfa747a req-6b5a16d9-51cc-418f-8524-8c413df4ae48 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec] Detach interface failed, port_id=da0c4ab2-4043-426e-a322-76ffcf2e1751, reason: Instance c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 07 22:20:46 compute-0 podman[227705]: 2025-10-07 22:20:46.835242242 +0000 UTC m=+0.070935656 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, release=1755695350, version=9.6, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm)
Oct 07 22:20:46 compute-0 nova_compute[192716]: 2025-10-07 22:20:46.870 2 INFO nova.compute.manager [-] [instance: c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec] Took 1.72 seconds to deallocate network for instance.
Oct 07 22:20:47 compute-0 nova_compute[192716]: 2025-10-07 22:20:47.391 2 DEBUG oslo_concurrency.lockutils [None req-91c852a3-5c3a-48e3-adc7-4e6f063d7fed a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:20:47 compute-0 nova_compute[192716]: 2025-10-07 22:20:47.392 2 DEBUG oslo_concurrency.lockutils [None req-91c852a3-5c3a-48e3-adc7-4e6f063d7fed a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:20:47 compute-0 nova_compute[192716]: 2025-10-07 22:20:47.399 2 DEBUG oslo_concurrency.lockutils [None req-91c852a3-5c3a-48e3-adc7-4e6f063d7fed a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.007s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:20:47 compute-0 nova_compute[192716]: 2025-10-07 22:20:47.425 2 INFO nova.scheduler.client.report [None req-91c852a3-5c3a-48e3-adc7-4e6f063d7fed a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Deleted allocations for instance c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec
Oct 07 22:20:47 compute-0 nova_compute[192716]: 2025-10-07 22:20:47.985 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:20:47 compute-0 nova_compute[192716]: 2025-10-07 22:20:47.989 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:20:48 compute-0 nova_compute[192716]: 2025-10-07 22:20:48.470 2 DEBUG oslo_concurrency.lockutils [None req-91c852a3-5c3a-48e3-adc7-4e6f063d7fed a0c373c3cf7242d4af22e259b5a27a6b 4cb01004a26f472187e01e5d3a57f84a - - default default] Lock "c2f0dd04-cc1b-4f5d-8d0a-ce24f88173ec" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.155s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:20:48 compute-0 nova_compute[192716]: 2025-10-07 22:20:48.507 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:20:48 compute-0 nova_compute[192716]: 2025-10-07 22:20:48.508 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:20:48 compute-0 nova_compute[192716]: 2025-10-07 22:20:48.508 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:20:48 compute-0 nova_compute[192716]: 2025-10-07 22:20:48.508 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 07 22:20:48 compute-0 nova_compute[192716]: 2025-10-07 22:20:48.738 2 WARNING nova.virt.libvirt.driver [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 22:20:48 compute-0 nova_compute[192716]: 2025-10-07 22:20:48.739 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:20:48 compute-0 nova_compute[192716]: 2025-10-07 22:20:48.763 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.024s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:20:48 compute-0 nova_compute[192716]: 2025-10-07 22:20:48.763 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5804MB free_disk=73.29894638061523GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 07 22:20:48 compute-0 nova_compute[192716]: 2025-10-07 22:20:48.764 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:20:48 compute-0 nova_compute[192716]: 2025-10-07 22:20:48.764 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:20:49 compute-0 nova_compute[192716]: 2025-10-07 22:20:49.683 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:49 compute-0 nova_compute[192716]: 2025-10-07 22:20:49.807 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 07 22:20:49 compute-0 nova_compute[192716]: 2025-10-07 22:20:49.807 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:20:48 up  1:29,  0 user,  load average: 0.06, 0.09, 0.16\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 07 22:20:49 compute-0 nova_compute[192716]: 2025-10-07 22:20:49.837 2 DEBUG nova.compute.provider_tree [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 22:20:50 compute-0 nova_compute[192716]: 2025-10-07 22:20:50.345 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 22:20:50 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:20:50.672 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=dca786dc-b408-4181-8e47-0e14c60f13da, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '31'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:20:50 compute-0 nova_compute[192716]: 2025-10-07 22:20:50.854 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 07 22:20:50 compute-0 nova_compute[192716]: 2025-10-07 22:20:50.854 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.090s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:20:50 compute-0 nova_compute[192716]: 2025-10-07 22:20:50.923 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:53 compute-0 nova_compute[192716]: 2025-10-07 22:20:53.808 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:54 compute-0 nova_compute[192716]: 2025-10-07 22:20:54.686 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:54 compute-0 nova_compute[192716]: 2025-10-07 22:20:54.856 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:20:55 compute-0 nova_compute[192716]: 2025-10-07 22:20:55.943 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:55 compute-0 nova_compute[192716]: 2025-10-07 22:20:55.991 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:20:58 compute-0 nova_compute[192716]: 2025-10-07 22:20:58.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:20:58 compute-0 nova_compute[192716]: 2025-10-07 22:20:58.991 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:20:59 compute-0 nova_compute[192716]: 2025-10-07 22:20:59.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:20:59 compute-0 podman[203153]: time="2025-10-07T22:20:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:20:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:20:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 22:20:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:20:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3025 "" "Go-http-client/1.1"
Oct 07 22:20:59 compute-0 podman[227730]: 2025-10-07 22:20:59.83477429 +0000 UTC m=+0.070983177 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct 07 22:20:59 compute-0 podman[227731]: 2025-10-07 22:20:59.872306538 +0000 UTC m=+0.093301300 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251007, tcib_build_tag=watcher_latest, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 07 22:21:00 compute-0 nova_compute[192716]: 2025-10-07 22:21:00.945 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:21:01 compute-0 openstack_network_exporter[205305]: ERROR   22:21:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:21:01 compute-0 openstack_network_exporter[205305]: ERROR   22:21:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:21:01 compute-0 openstack_network_exporter[205305]: ERROR   22:21:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:21:01 compute-0 openstack_network_exporter[205305]: ERROR   22:21:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:21:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:21:01 compute-0 openstack_network_exporter[205305]: ERROR   22:21:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:21:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:21:03 compute-0 podman[227771]: 2025-10-07 22:21:03.857492037 +0000 UTC m=+0.087431789 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 07 22:21:04 compute-0 nova_compute[192716]: 2025-10-07 22:21:04.690 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:21:05 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:21:05.120 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:61:0b:74 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-8558ef14-bbda-4677-87c6-5cb9edc8eae5', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8558ef14-bbda-4677-87c6-5cb9edc8eae5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f08d1843f22b495a995f11f0b1c90ace', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9e008ff2-8b66-4169-982c-1129d9c90df2, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ffdd001d-1d41-425c-92ed-0208c7cec9cd) old=Port_Binding(mac=['fa:16:3e:61:0b:74'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-8558ef14-bbda-4677-87c6-5cb9edc8eae5', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8558ef14-bbda-4677-87c6-5cb9edc8eae5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f08d1843f22b495a995f11f0b1c90ace', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:21:05 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:21:05.121 103791 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ffdd001d-1d41-425c-92ed-0208c7cec9cd in datapath 8558ef14-bbda-4677-87c6-5cb9edc8eae5 updated
Oct 07 22:21:05 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:21:05.123 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8558ef14-bbda-4677-87c6-5cb9edc8eae5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 07 22:21:05 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:21:05.124 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[7c8316fc-9150-4c2c-b471-b2e965aad636]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:21:05 compute-0 nova_compute[192716]: 2025-10-07 22:21:05.979 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:21:09 compute-0 nova_compute[192716]: 2025-10-07 22:21:09.692 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:21:09 compute-0 nova_compute[192716]: 2025-10-07 22:21:09.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:21:09 compute-0 nova_compute[192716]: 2025-10-07 22:21:09.991 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:21:09 compute-0 nova_compute[192716]: 2025-10-07 22:21:09.992 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:21:09 compute-0 nova_compute[192716]: 2025-10-07 22:21:09.993 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:21:09 compute-0 nova_compute[192716]: 2025-10-07 22:21:09.993 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:21:09 compute-0 nova_compute[192716]: 2025-10-07 22:21:09.994 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:21:09 compute-0 nova_compute[192716]: 2025-10-07 22:21:09.995 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:21:10 compute-0 nova_compute[192716]: 2025-10-07 22:21:10.981 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:21:11 compute-0 nova_compute[192716]: 2025-10-07 22:21:11.009 2 DEBUG nova.virt.libvirt.imagecache [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:314
Oct 07 22:21:11 compute-0 nova_compute[192716]: 2025-10-07 22:21:11.010 2 WARNING nova.virt.libvirt.imagecache [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Unknown base file: /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71
Oct 07 22:21:11 compute-0 nova_compute[192716]: 2025-10-07 22:21:11.010 2 INFO nova.virt.libvirt.imagecache [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Removable base files: /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71
Oct 07 22:21:11 compute-0 nova_compute[192716]: 2025-10-07 22:21:11.011 2 INFO nova.virt.libvirt.imagecache [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71
Oct 07 22:21:11 compute-0 nova_compute[192716]: 2025-10-07 22:21:11.011 2 DEBUG nova.virt.libvirt.imagecache [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:350
Oct 07 22:21:11 compute-0 nova_compute[192716]: 2025-10-07 22:21:11.012 2 DEBUG nova.virt.libvirt.imagecache [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:299
Oct 07 22:21:11 compute-0 nova_compute[192716]: 2025-10-07 22:21:11.012 2 DEBUG nova.virt.libvirt.imagecache [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:284
Oct 07 22:21:11 compute-0 podman[227796]: 2025-10-07 22:21:11.83516987 +0000 UTC m=+0.066955810 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4)
Oct 07 22:21:11 compute-0 podman[227795]: 2025-10-07 22:21:11.8809648 +0000 UTC m=+0.116616753 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct 07 22:21:13 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:21:13.565 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ec:2b:45 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-3e21ae1d-2347-4eb8-ae84-628ee071af42', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3e21ae1d-2347-4eb8-ae84-628ee071af42', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4afdd5a94b604f91a0cbdb5a281ca0c6', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2799d332-3c5c-440e-a368-9d532e5d6347, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=07f2f4ca-edb7-46af-941b-908bed70dabf) old=Port_Binding(mac=['fa:16:3e:ec:2b:45'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-3e21ae1d-2347-4eb8-ae84-628ee071af42', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3e21ae1d-2347-4eb8-ae84-628ee071af42', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4afdd5a94b604f91a0cbdb5a281ca0c6', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:21:13 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:21:13.566 103791 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 07f2f4ca-edb7-46af-941b-908bed70dabf in datapath 3e21ae1d-2347-4eb8-ae84-628ee071af42 updated
Oct 07 22:21:13 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:21:13.567 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3e21ae1d-2347-4eb8-ae84-628ee071af42, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 07 22:21:13 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:21:13.568 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[d16ab297-974b-4790-947f-65bcca615a62]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:21:14 compute-0 nova_compute[192716]: 2025-10-07 22:21:14.695 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:21:16 compute-0 nova_compute[192716]: 2025-10-07 22:21:16.021 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:21:17 compute-0 podman[227837]: 2025-10-07 22:21:17.857466186 +0000 UTC m=+0.085379939 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=edpm_ansible, vendor=Red Hat, Inc., architecture=x86_64, release=1755695350, com.redhat.component=ubi9-minimal-container, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter)
Oct 07 22:21:19 compute-0 nova_compute[192716]: 2025-10-07 22:21:19.731 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:21:21 compute-0 nova_compute[192716]: 2025-10-07 22:21:21.024 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:21:24 compute-0 nova_compute[192716]: 2025-10-07 22:21:24.734 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:21:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:21:25.663 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:21:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:21:25.663 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:21:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:21:25.663 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:21:26 compute-0 nova_compute[192716]: 2025-10-07 22:21:26.026 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:21:27 compute-0 ovn_controller[94904]: 2025-10-07T22:21:27Z|00263|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Oct 07 22:21:29 compute-0 nova_compute[192716]: 2025-10-07 22:21:29.736 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:21:29 compute-0 podman[203153]: time="2025-10-07T22:21:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:21:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:21:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 22:21:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:21:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3023 "" "Go-http-client/1.1"
Oct 07 22:21:30 compute-0 podman[227861]: 2025-10-07 22:21:30.811621688 +0000 UTC m=+0.051820817 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251007, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible, container_name=iscsid)
Oct 07 22:21:30 compute-0 podman[227862]: 2025-10-07 22:21:30.84897048 +0000 UTC m=+0.076649783 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Oct 07 22:21:31 compute-0 nova_compute[192716]: 2025-10-07 22:21:31.028 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:21:31 compute-0 openstack_network_exporter[205305]: ERROR   22:21:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:21:31 compute-0 openstack_network_exporter[205305]: ERROR   22:21:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:21:31 compute-0 openstack_network_exporter[205305]: ERROR   22:21:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:21:31 compute-0 openstack_network_exporter[205305]: ERROR   22:21:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:21:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:21:31 compute-0 openstack_network_exporter[205305]: ERROR   22:21:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:21:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:21:34 compute-0 nova_compute[192716]: 2025-10-07 22:21:34.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:21:34 compute-0 podman[227903]: 2025-10-07 22:21:34.834578678 +0000 UTC m=+0.069931053 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 07 22:21:34 compute-0 nova_compute[192716]: 2025-10-07 22:21:34.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:21:36 compute-0 nova_compute[192716]: 2025-10-07 22:21:36.030 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:21:39 compute-0 nova_compute[192716]: 2025-10-07 22:21:39.742 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:21:41 compute-0 nova_compute[192716]: 2025-10-07 22:21:41.032 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:21:42 compute-0 nova_compute[192716]: 2025-10-07 22:21:42.497 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:21:42 compute-0 nova_compute[192716]: 2025-10-07 22:21:42.497 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 07 22:21:42 compute-0 podman[227928]: 2025-10-07 22:21:42.827324551 +0000 UTC m=+0.061199081 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Oct 07 22:21:42 compute-0 podman[227927]: 2025-10-07 22:21:42.906000995 +0000 UTC m=+0.142953564 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_controller)
Oct 07 22:21:44 compute-0 nova_compute[192716]: 2025-10-07 22:21:44.746 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:21:44 compute-0 nova_compute[192716]: 2025-10-07 22:21:44.991 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:21:46 compute-0 nova_compute[192716]: 2025-10-07 22:21:46.084 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:21:47 compute-0 nova_compute[192716]: 2025-10-07 22:21:47.705 2 DEBUG oslo_concurrency.lockutils [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Acquiring lock "e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:21:47 compute-0 nova_compute[192716]: 2025-10-07 22:21:47.706 2 DEBUG oslo_concurrency.lockutils [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Lock "e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:21:48 compute-0 nova_compute[192716]: 2025-10-07 22:21:48.212 2 DEBUG nova.compute.manager [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Oct 07 22:21:48 compute-0 nova_compute[192716]: 2025-10-07 22:21:48.775 2 DEBUG oslo_concurrency.lockutils [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:21:48 compute-0 nova_compute[192716]: 2025-10-07 22:21:48.776 2 DEBUG oslo_concurrency.lockutils [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:21:48 compute-0 nova_compute[192716]: 2025-10-07 22:21:48.784 2 DEBUG nova.virt.hardware [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Oct 07 22:21:48 compute-0 nova_compute[192716]: 2025-10-07 22:21:48.785 2 INFO nova.compute.claims [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Claim successful on node compute-0.ctlplane.example.com
Oct 07 22:21:48 compute-0 podman[227971]: 2025-10-07 22:21:48.832331922 +0000 UTC m=+0.066924876 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, version=9.6, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.component=ubi9-minimal-container)
Oct 07 22:21:49 compute-0 nova_compute[192716]: 2025-10-07 22:21:49.749 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:21:49 compute-0 nova_compute[192716]: 2025-10-07 22:21:49.832 2 DEBUG nova.scheduler.client.report [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Refreshing inventories for resource provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Oct 07 22:21:49 compute-0 nova_compute[192716]: 2025-10-07 22:21:49.854 2 DEBUG nova.scheduler.client.report [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Updating ProviderTree inventory for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Oct 07 22:21:49 compute-0 nova_compute[192716]: 2025-10-07 22:21:49.855 2 DEBUG nova.compute.provider_tree [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Updating inventory in ProviderTree for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Oct 07 22:21:49 compute-0 nova_compute[192716]: 2025-10-07 22:21:49.882 2 DEBUG nova.scheduler.client.report [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Refreshing aggregate associations for resource provider 19d1aa8e-e3fb-43ab-9849-122569e48a32, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Oct 07 22:21:49 compute-0 nova_compute[192716]: 2025-10-07 22:21:49.905 2 DEBUG nova.scheduler.client.report [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Refreshing trait associations for resource provider 19d1aa8e-e3fb-43ab-9849-122569e48a32, traits: COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_RESCUE_BFV,COMPUTE_ACCELERATORS,HW_CPU_X86_CLMUL,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NODE,HW_CPU_X86_AVX2,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSSE3,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_ARCH_X86_64,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_DEVICE_TAGGING,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_EXTEND,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_AVX,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE41,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SVM,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_BMI2,COMPUTE_SOUND_MODEL_SB16,COMPUTE_SOUND_MODEL_USB,COMPUTE_SOUND_MODEL_AC97,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_AESNI,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,HW_CPU_X86_F16C,HW_CPU_X86_MMX,COMPUTE_SECURITY_TPM_TIS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_CRB,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_ABM,
HW_ARCH_X86_64,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_STORAGE_BUS_SCSI _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Oct 07 22:21:49 compute-0 nova_compute[192716]: 2025-10-07 22:21:49.944 2 DEBUG nova.compute.provider_tree [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 22:21:49 compute-0 nova_compute[192716]: 2025-10-07 22:21:49.985 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:21:49 compute-0 nova_compute[192716]: 2025-10-07 22:21:49.989 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:21:50 compute-0 nova_compute[192716]: 2025-10-07 22:21:50.453 2 DEBUG nova.scheduler.client.report [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 22:21:50 compute-0 nova_compute[192716]: 2025-10-07 22:21:50.501 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:21:50 compute-0 nova_compute[192716]: 2025-10-07 22:21:50.965 2 DEBUG oslo_concurrency.lockutils [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.189s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:21:50 compute-0 nova_compute[192716]: 2025-10-07 22:21:50.966 2 DEBUG nova.compute.manager [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Oct 07 22:21:50 compute-0 nova_compute[192716]: 2025-10-07 22:21:50.971 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.469s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:21:50 compute-0 nova_compute[192716]: 2025-10-07 22:21:50.971 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:21:50 compute-0 nova_compute[192716]: 2025-10-07 22:21:50.972 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 07 22:21:51 compute-0 nova_compute[192716]: 2025-10-07 22:21:51.131 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:21:51 compute-0 nova_compute[192716]: 2025-10-07 22:21:51.214 2 WARNING nova.virt.libvirt.driver [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 22:21:51 compute-0 nova_compute[192716]: 2025-10-07 22:21:51.216 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:21:51 compute-0 nova_compute[192716]: 2025-10-07 22:21:51.243 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.027s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:21:51 compute-0 nova_compute[192716]: 2025-10-07 22:21:51.243 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5857MB free_disk=73.29894256591797GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 07 22:21:51 compute-0 nova_compute[192716]: 2025-10-07 22:21:51.244 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:21:51 compute-0 nova_compute[192716]: 2025-10-07 22:21:51.244 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:21:51 compute-0 nova_compute[192716]: 2025-10-07 22:21:51.484 2 DEBUG nova.compute.manager [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Oct 07 22:21:51 compute-0 nova_compute[192716]: 2025-10-07 22:21:51.485 2 DEBUG nova.network.neutron [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Oct 07 22:21:51 compute-0 nova_compute[192716]: 2025-10-07 22:21:51.486 2 WARNING neutronclient.v2_0.client [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:21:51 compute-0 nova_compute[192716]: 2025-10-07 22:21:51.486 2 WARNING neutronclient.v2_0.client [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:21:51 compute-0 nova_compute[192716]: 2025-10-07 22:21:51.998 2 INFO nova.virt.libvirt.driver [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 07 22:21:52 compute-0 nova_compute[192716]: 2025-10-07 22:21:52.281 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Instance e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 07 22:21:52 compute-0 nova_compute[192716]: 2025-10-07 22:21:52.281 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 07 22:21:52 compute-0 nova_compute[192716]: 2025-10-07 22:21:52.282 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:21:51 up  1:30,  0 user,  load average: 0.02, 0.07, 0.15\n', 'num_instances': '1', 'num_vm_building': '1', 'num_task_networking': '1', 'num_os_type_None': '1', 'num_proj_4afdd5a94b604f91a0cbdb5a281ca0c6': '1', 'io_workload': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 07 22:21:52 compute-0 nova_compute[192716]: 2025-10-07 22:21:52.329 2 DEBUG nova.compute.provider_tree [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 22:21:52 compute-0 nova_compute[192716]: 2025-10-07 22:21:52.506 2 DEBUG nova.compute.manager [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Oct 07 22:21:52 compute-0 nova_compute[192716]: 2025-10-07 22:21:52.835 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 22:21:53 compute-0 nova_compute[192716]: 2025-10-07 22:21:53.344 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 07 22:21:53 compute-0 nova_compute[192716]: 2025-10-07 22:21:53.345 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.101s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:21:53 compute-0 nova_compute[192716]: 2025-10-07 22:21:53.528 2 DEBUG nova.compute.manager [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Oct 07 22:21:53 compute-0 nova_compute[192716]: 2025-10-07 22:21:53.530 2 DEBUG nova.virt.libvirt.driver [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Oct 07 22:21:53 compute-0 nova_compute[192716]: 2025-10-07 22:21:53.530 2 INFO nova.virt.libvirt.driver [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Creating image(s)
Oct 07 22:21:53 compute-0 nova_compute[192716]: 2025-10-07 22:21:53.531 2 DEBUG oslo_concurrency.lockutils [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Acquiring lock "/var/lib/nova/instances/e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:21:53 compute-0 nova_compute[192716]: 2025-10-07 22:21:53.532 2 DEBUG oslo_concurrency.lockutils [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Lock "/var/lib/nova/instances/e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:21:53 compute-0 nova_compute[192716]: 2025-10-07 22:21:53.533 2 DEBUG oslo_concurrency.lockutils [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Lock "/var/lib/nova/instances/e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:21:53 compute-0 nova_compute[192716]: 2025-10-07 22:21:53.534 2 DEBUG oslo_utils.imageutils.format_inspector [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:21:53 compute-0 nova_compute[192716]: 2025-10-07 22:21:53.540 2 DEBUG oslo_utils.imageutils.format_inspector [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:21:53 compute-0 nova_compute[192716]: 2025-10-07 22:21:53.543 2 DEBUG oslo_concurrency.processutils [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:21:53 compute-0 nova_compute[192716]: 2025-10-07 22:21:53.606 2 DEBUG oslo_concurrency.processutils [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:21:53 compute-0 nova_compute[192716]: 2025-10-07 22:21:53.608 2 DEBUG oslo_concurrency.lockutils [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Acquiring lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:21:53 compute-0 nova_compute[192716]: 2025-10-07 22:21:53.609 2 DEBUG oslo_concurrency.lockutils [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:21:53 compute-0 nova_compute[192716]: 2025-10-07 22:21:53.610 2 DEBUG oslo_utils.imageutils.format_inspector [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:21:53 compute-0 nova_compute[192716]: 2025-10-07 22:21:53.616 2 DEBUG oslo_utils.imageutils.format_inspector [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:21:53 compute-0 nova_compute[192716]: 2025-10-07 22:21:53.617 2 DEBUG oslo_concurrency.processutils [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:21:53 compute-0 nova_compute[192716]: 2025-10-07 22:21:53.679 2 DEBUG oslo_concurrency.processutils [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:21:53 compute-0 nova_compute[192716]: 2025-10-07 22:21:53.680 2 DEBUG oslo_concurrency.processutils [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71,backing_fmt=raw /var/lib/nova/instances/e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:21:53 compute-0 nova_compute[192716]: 2025-10-07 22:21:53.720 2 DEBUG oslo_concurrency.processutils [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71,backing_fmt=raw /var/lib/nova/instances/e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:21:53 compute-0 nova_compute[192716]: 2025-10-07 22:21:53.721 2 DEBUG oslo_concurrency.lockutils [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.112s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:21:53 compute-0 nova_compute[192716]: 2025-10-07 22:21:53.721 2 DEBUG oslo_concurrency.processutils [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:21:53 compute-0 nova_compute[192716]: 2025-10-07 22:21:53.788 2 DEBUG oslo_concurrency.processutils [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:21:53 compute-0 nova_compute[192716]: 2025-10-07 22:21:53.790 2 DEBUG nova.virt.disk.api [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Checking if we can resize image /var/lib/nova/instances/e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 07 22:21:53 compute-0 nova_compute[192716]: 2025-10-07 22:21:53.790 2 DEBUG oslo_concurrency.processutils [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:21:53 compute-0 nova_compute[192716]: 2025-10-07 22:21:53.884 2 DEBUG oslo_concurrency.processutils [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:21:53 compute-0 nova_compute[192716]: 2025-10-07 22:21:53.887 2 DEBUG nova.virt.disk.api [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Cannot resize image /var/lib/nova/instances/e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 07 22:21:53 compute-0 nova_compute[192716]: 2025-10-07 22:21:53.888 2 DEBUG nova.virt.libvirt.driver [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Oct 07 22:21:53 compute-0 nova_compute[192716]: 2025-10-07 22:21:53.889 2 DEBUG nova.virt.libvirt.driver [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Ensure instance console log exists: /var/lib/nova/instances/e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Oct 07 22:21:53 compute-0 nova_compute[192716]: 2025-10-07 22:21:53.890 2 DEBUG oslo_concurrency.lockutils [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:21:53 compute-0 nova_compute[192716]: 2025-10-07 22:21:53.891 2 DEBUG oslo_concurrency.lockutils [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:21:53 compute-0 nova_compute[192716]: 2025-10-07 22:21:53.891 2 DEBUG oslo_concurrency.lockutils [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:21:54 compute-0 nova_compute[192716]: 2025-10-07 22:21:54.752 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:21:55 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:21:55.627 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=32, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:e1:cc', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '62:84:ba:a4:1b:31'}, ipsec=False) old=SB_Global(nb_cfg=31) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:21:55 compute-0 nova_compute[192716]: 2025-10-07 22:21:55.627 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:21:55 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:21:55.628 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 07 22:21:55 compute-0 nova_compute[192716]: 2025-10-07 22:21:55.799 2 DEBUG nova.network.neutron [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Successfully created port: 2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Oct 07 22:21:56 compute-0 nova_compute[192716]: 2025-10-07 22:21:56.134 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:21:56 compute-0 nova_compute[192716]: 2025-10-07 22:21:56.455 2 DEBUG nova.network.neutron [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Successfully updated port: 2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Oct 07 22:21:56 compute-0 nova_compute[192716]: 2025-10-07 22:21:56.526 2 DEBUG nova.compute.manager [req-c0f8879a-ef93-45fc-a2fb-61462a770059 req-b9365665-5dfe-4f70-9e79-94f07dd949d7 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Received event network-changed-2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:21:56 compute-0 nova_compute[192716]: 2025-10-07 22:21:56.527 2 DEBUG nova.compute.manager [req-c0f8879a-ef93-45fc-a2fb-61462a770059 req-b9365665-5dfe-4f70-9e79-94f07dd949d7 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Refreshing instance network info cache due to event network-changed-2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Oct 07 22:21:56 compute-0 nova_compute[192716]: 2025-10-07 22:21:56.527 2 DEBUG oslo_concurrency.lockutils [req-c0f8879a-ef93-45fc-a2fb-61462a770059 req-b9365665-5dfe-4f70-9e79-94f07dd949d7 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "refresh_cache-e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 07 22:21:56 compute-0 nova_compute[192716]: 2025-10-07 22:21:56.528 2 DEBUG oslo_concurrency.lockutils [req-c0f8879a-ef93-45fc-a2fb-61462a770059 req-b9365665-5dfe-4f70-9e79-94f07dd949d7 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquired lock "refresh_cache-e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 07 22:21:56 compute-0 nova_compute[192716]: 2025-10-07 22:21:56.528 2 DEBUG nova.network.neutron [req-c0f8879a-ef93-45fc-a2fb-61462a770059 req-b9365665-5dfe-4f70-9e79-94f07dd949d7 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Refreshing network info cache for port 2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Oct 07 22:21:56 compute-0 nova_compute[192716]: 2025-10-07 22:21:56.963 2 DEBUG oslo_concurrency.lockutils [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Acquiring lock "refresh_cache-e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 07 22:21:57 compute-0 nova_compute[192716]: 2025-10-07 22:21:57.038 2 WARNING neutronclient.v2_0.client [req-c0f8879a-ef93-45fc-a2fb-61462a770059 req-b9365665-5dfe-4f70-9e79-94f07dd949d7 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:21:57 compute-0 nova_compute[192716]: 2025-10-07 22:21:57.346 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:21:57 compute-0 nova_compute[192716]: 2025-10-07 22:21:57.347 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:21:57 compute-0 nova_compute[192716]: 2025-10-07 22:21:57.598 2 DEBUG nova.network.neutron [req-c0f8879a-ef93-45fc-a2fb-61462a770059 req-b9365665-5dfe-4f70-9e79-94f07dd949d7 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 07 22:21:57 compute-0 nova_compute[192716]: 2025-10-07 22:21:57.818 2 DEBUG nova.network.neutron [req-c0f8879a-ef93-45fc-a2fb-61462a770059 req-b9365665-5dfe-4f70-9e79-94f07dd949d7 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:21:58 compute-0 nova_compute[192716]: 2025-10-07 22:21:58.326 2 DEBUG oslo_concurrency.lockutils [req-c0f8879a-ef93-45fc-a2fb-61462a770059 req-b9365665-5dfe-4f70-9e79-94f07dd949d7 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Releasing lock "refresh_cache-e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 07 22:21:58 compute-0 nova_compute[192716]: 2025-10-07 22:21:58.327 2 DEBUG oslo_concurrency.lockutils [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Acquired lock "refresh_cache-e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 07 22:21:58 compute-0 nova_compute[192716]: 2025-10-07 22:21:58.327 2 DEBUG nova.network.neutron [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 07 22:21:58 compute-0 nova_compute[192716]: 2025-10-07 22:21:58.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:21:59 compute-0 nova_compute[192716]: 2025-10-07 22:21:59.611 2 DEBUG nova.network.neutron [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 07 22:21:59 compute-0 podman[203153]: time="2025-10-07T22:21:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:21:59 compute-0 nova_compute[192716]: 2025-10-07 22:21:59.754 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:21:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:21:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 22:21:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:21:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3023 "" "Go-http-client/1.1"
Oct 07 22:21:59 compute-0 nova_compute[192716]: 2025-10-07 22:21:59.911 2 WARNING neutronclient.v2_0.client [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:21:59 compute-0 nova_compute[192716]: 2025-10-07 22:21:59.991 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:22:00 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:00.629 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=dca786dc-b408-4181-8e47-0e14c60f13da, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '32'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:22:00 compute-0 nova_compute[192716]: 2025-10-07 22:22:00.789 2 DEBUG nova.network.neutron [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Updating instance_info_cache with network_info: [{"id": "2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c", "address": "fa:16:3e:d8:7f:e0", "network": {"id": "8558ef14-bbda-4677-87c6-5cb9edc8eae5", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1182536919-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f08d1843f22b495a995f11f0b1c90ace", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2959c2dd-2b", "ovs_interfaceid": "2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:22:00 compute-0 nova_compute[192716]: 2025-10-07 22:22:00.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:22:00 compute-0 nova_compute[192716]: 2025-10-07 22:22:00.990 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Oct 07 22:22:01 compute-0 nova_compute[192716]: 2025-10-07 22:22:01.176 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:01 compute-0 nova_compute[192716]: 2025-10-07 22:22:01.297 2 DEBUG oslo_concurrency.lockutils [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Releasing lock "refresh_cache-e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 07 22:22:01 compute-0 nova_compute[192716]: 2025-10-07 22:22:01.298 2 DEBUG nova.compute.manager [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Instance network_info: |[{"id": "2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c", "address": "fa:16:3e:d8:7f:e0", "network": {"id": "8558ef14-bbda-4677-87c6-5cb9edc8eae5", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1182536919-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f08d1843f22b495a995f11f0b1c90ace", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2959c2dd-2b", "ovs_interfaceid": "2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Oct 07 22:22:01 compute-0 nova_compute[192716]: 2025-10-07 22:22:01.300 2 DEBUG nova.virt.libvirt.driver [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Start _get_guest_xml network_info=[{"id": "2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c", "address": "fa:16:3e:d8:7f:e0", "network": {"id": "8558ef14-bbda-4677-87c6-5cb9edc8eae5", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1182536919-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f08d1843f22b495a995f11f0b1c90ace", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2959c2dd-2b", "ovs_interfaceid": "2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T21:45:40Z,direct_url=<?>,disk_format='qcow2',id=c40cab67-7e52-4762-b275-de0efa24bdf4,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='293ff4341f3d48a4ae100bf4fc7b99bd',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T21:45:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'boot_index': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'image_id': 'c40cab67-7e52-4762-b275-de0efa24bdf4'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Oct 07 22:22:01 compute-0 nova_compute[192716]: 2025-10-07 22:22:01.303 2 WARNING nova.virt.libvirt.driver [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 22:22:01 compute-0 nova_compute[192716]: 2025-10-07 22:22:01.305 2 DEBUG nova.virt.driver [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='c40cab67-7e52-4762-b275-de0efa24bdf4', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteZoneMigrationStrategy-server-1784705288', uuid='e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6'), owner=OwnerMeta(userid='f4a3102f6d8e4d53980ce5f605b5e7db', username='tempest-TestExecuteZoneMigrationStrategy-14286770-project-admin', projectid='4afdd5a94b604f91a0cbdb5a281ca0c6', projectname='tempest-TestExecuteZoneMigrationStrategy-14286770'), image=ImageMeta(id='c40cab67-7e52-4762-b275-de0efa24bdf4', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='e6ccb6f6-2ec4-4305-9fdd-4d931e83ed21', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c", "address": "fa:16:3e:d8:7f:e0", "network": {"id": "8558ef14-bbda-4677-87c6-5cb9edc8eae5", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1182536919-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f08d1843f22b495a995f11f0b1c90ace", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2959c2dd-2b", "ovs_interfaceid": 
"2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251007122402.7278e66.el10', creation_time=1759875721.3049765) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Oct 07 22:22:01 compute-0 nova_compute[192716]: 2025-10-07 22:22:01.308 2 DEBUG nova.virt.libvirt.host [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Oct 07 22:22:01 compute-0 nova_compute[192716]: 2025-10-07 22:22:01.309 2 DEBUG nova.virt.libvirt.host [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Oct 07 22:22:01 compute-0 nova_compute[192716]: 2025-10-07 22:22:01.311 2 DEBUG nova.virt.libvirt.host [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Oct 07 22:22:01 compute-0 nova_compute[192716]: 2025-10-07 22:22:01.312 2 DEBUG nova.virt.libvirt.host [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Oct 07 22:22:01 compute-0 nova_compute[192716]: 2025-10-07 22:22:01.312 2 DEBUG nova.virt.libvirt.driver [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Oct 07 22:22:01 compute-0 nova_compute[192716]: 2025-10-07 22:22:01.312 2 DEBUG nova.virt.hardware [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-07T21:45:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='e6ccb6f6-2ec4-4305-9fdd-4d931e83ed21',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T21:45:40Z,direct_url=<?>,disk_format='qcow2',id=c40cab67-7e52-4762-b275-de0efa24bdf4,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='293ff4341f3d48a4ae100bf4fc7b99bd',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T21:45:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Oct 07 22:22:01 compute-0 nova_compute[192716]: 2025-10-07 22:22:01.313 2 DEBUG nova.virt.hardware [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Oct 07 22:22:01 compute-0 nova_compute[192716]: 2025-10-07 22:22:01.313 2 DEBUG nova.virt.hardware [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Oct 07 22:22:01 compute-0 nova_compute[192716]: 2025-10-07 22:22:01.313 2 DEBUG nova.virt.hardware [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Oct 07 22:22:01 compute-0 nova_compute[192716]: 2025-10-07 22:22:01.313 2 DEBUG nova.virt.hardware [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Oct 07 22:22:01 compute-0 nova_compute[192716]: 2025-10-07 22:22:01.313 2 DEBUG nova.virt.hardware [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Oct 07 22:22:01 compute-0 nova_compute[192716]: 2025-10-07 22:22:01.314 2 DEBUG nova.virt.hardware [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Oct 07 22:22:01 compute-0 nova_compute[192716]: 2025-10-07 22:22:01.314 2 DEBUG nova.virt.hardware [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Oct 07 22:22:01 compute-0 nova_compute[192716]: 2025-10-07 22:22:01.314 2 DEBUG nova.virt.hardware [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Oct 07 22:22:01 compute-0 nova_compute[192716]: 2025-10-07 22:22:01.314 2 DEBUG nova.virt.hardware [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Oct 07 22:22:01 compute-0 nova_compute[192716]: 2025-10-07 22:22:01.314 2 DEBUG nova.virt.hardware [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Oct 07 22:22:01 compute-0 nova_compute[192716]: 2025-10-07 22:22:01.318 2 DEBUG nova.virt.libvirt.vif [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-07T22:21:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1784705288',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1784705288',id=31,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4afdd5a94b604f91a0cbdb5a281ca0c6',ramdisk_id='',reservation_id='r-pbwh49x2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-14286770',owner_user_name='tempest-TestExecuteZon
eMigrationStrategy-14286770-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T22:21:52Z,user_data=None,user_id='f4a3102f6d8e4d53980ce5f605b5e7db',uuid=e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c", "address": "fa:16:3e:d8:7f:e0", "network": {"id": "8558ef14-bbda-4677-87c6-5cb9edc8eae5", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1182536919-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f08d1843f22b495a995f11f0b1c90ace", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2959c2dd-2b", "ovs_interfaceid": "2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Oct 07 22:22:01 compute-0 nova_compute[192716]: 2025-10-07 22:22:01.319 2 DEBUG nova.network.os_vif_util [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Converting VIF {"id": "2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c", "address": "fa:16:3e:d8:7f:e0", "network": {"id": "8558ef14-bbda-4677-87c6-5cb9edc8eae5", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1182536919-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f08d1843f22b495a995f11f0b1c90ace", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2959c2dd-2b", "ovs_interfaceid": "2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 22:22:01 compute-0 nova_compute[192716]: 2025-10-07 22:22:01.319 2 DEBUG nova.network.os_vif_util [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d8:7f:e0,bridge_name='br-int',has_traffic_filtering=True,id=2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c,network=Network(8558ef14-bbda-4677-87c6-5cb9edc8eae5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2959c2dd-2b') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 22:22:01 compute-0 nova_compute[192716]: 2025-10-07 22:22:01.320 2 DEBUG nova.objects.instance [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Lazy-loading 'pci_devices' on Instance uuid e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 22:22:01 compute-0 openstack_network_exporter[205305]: ERROR   22:22:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:22:01 compute-0 openstack_network_exporter[205305]: ERROR   22:22:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:22:01 compute-0 openstack_network_exporter[205305]: ERROR   22:22:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:22:01 compute-0 openstack_network_exporter[205305]: ERROR   22:22:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:22:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:22:01 compute-0 openstack_network_exporter[205305]: ERROR   22:22:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:22:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:22:01 compute-0 nova_compute[192716]: 2025-10-07 22:22:01.498 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Oct 07 22:22:01 compute-0 podman[228009]: 2025-10-07 22:22:01.825112694 +0000 UTC m=+0.066954747 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true)
Oct 07 22:22:01 compute-0 nova_compute[192716]: 2025-10-07 22:22:01.827 2 DEBUG nova.virt.libvirt.driver [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] End _get_guest_xml xml=<domain type="kvm">
Oct 07 22:22:01 compute-0 nova_compute[192716]:   <uuid>e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6</uuid>
Oct 07 22:22:01 compute-0 nova_compute[192716]:   <name>instance-0000001f</name>
Oct 07 22:22:01 compute-0 nova_compute[192716]:   <memory>131072</memory>
Oct 07 22:22:01 compute-0 nova_compute[192716]:   <vcpu>1</vcpu>
Oct 07 22:22:01 compute-0 nova_compute[192716]:   <metadata>
Oct 07 22:22:01 compute-0 nova_compute[192716]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 07 22:22:01 compute-0 nova_compute[192716]:       <nova:package version="32.1.0-0.20251007122402.7278e66.el10"/>
Oct 07 22:22:01 compute-0 nova_compute[192716]:       <nova:name>tempest-TestExecuteZoneMigrationStrategy-server-1784705288</nova:name>
Oct 07 22:22:01 compute-0 nova_compute[192716]:       <nova:creationTime>2025-10-07 22:22:01</nova:creationTime>
Oct 07 22:22:01 compute-0 nova_compute[192716]:       <nova:flavor name="m1.nano" id="e6ccb6f6-2ec4-4305-9fdd-4d931e83ed21">
Oct 07 22:22:01 compute-0 nova_compute[192716]:         <nova:memory>128</nova:memory>
Oct 07 22:22:01 compute-0 nova_compute[192716]:         <nova:disk>1</nova:disk>
Oct 07 22:22:01 compute-0 nova_compute[192716]:         <nova:swap>0</nova:swap>
Oct 07 22:22:01 compute-0 nova_compute[192716]:         <nova:ephemeral>0</nova:ephemeral>
Oct 07 22:22:01 compute-0 nova_compute[192716]:         <nova:vcpus>1</nova:vcpus>
Oct 07 22:22:01 compute-0 nova_compute[192716]:         <nova:extraSpecs>
Oct 07 22:22:01 compute-0 nova_compute[192716]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 07 22:22:01 compute-0 nova_compute[192716]:         </nova:extraSpecs>
Oct 07 22:22:01 compute-0 nova_compute[192716]:       </nova:flavor>
Oct 07 22:22:01 compute-0 nova_compute[192716]:       <nova:image uuid="c40cab67-7e52-4762-b275-de0efa24bdf4">
Oct 07 22:22:01 compute-0 nova_compute[192716]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 07 22:22:01 compute-0 nova_compute[192716]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 07 22:22:01 compute-0 nova_compute[192716]:         <nova:minDisk>1</nova:minDisk>
Oct 07 22:22:01 compute-0 nova_compute[192716]:         <nova:minRam>0</nova:minRam>
Oct 07 22:22:01 compute-0 nova_compute[192716]:         <nova:properties>
Oct 07 22:22:01 compute-0 nova_compute[192716]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 07 22:22:01 compute-0 nova_compute[192716]:         </nova:properties>
Oct 07 22:22:01 compute-0 nova_compute[192716]:       </nova:image>
Oct 07 22:22:01 compute-0 nova_compute[192716]:       <nova:owner>
Oct 07 22:22:01 compute-0 nova_compute[192716]:         <nova:user uuid="f4a3102f6d8e4d53980ce5f605b5e7db">tempest-TestExecuteZoneMigrationStrategy-14286770-project-admin</nova:user>
Oct 07 22:22:01 compute-0 nova_compute[192716]:         <nova:project uuid="4afdd5a94b604f91a0cbdb5a281ca0c6">tempest-TestExecuteZoneMigrationStrategy-14286770</nova:project>
Oct 07 22:22:01 compute-0 nova_compute[192716]:       </nova:owner>
Oct 07 22:22:01 compute-0 nova_compute[192716]:       <nova:root type="image" uuid="c40cab67-7e52-4762-b275-de0efa24bdf4"/>
Oct 07 22:22:01 compute-0 nova_compute[192716]:       <nova:ports>
Oct 07 22:22:01 compute-0 nova_compute[192716]:         <nova:port uuid="2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c">
Oct 07 22:22:01 compute-0 nova_compute[192716]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Oct 07 22:22:01 compute-0 nova_compute[192716]:         </nova:port>
Oct 07 22:22:01 compute-0 nova_compute[192716]:       </nova:ports>
Oct 07 22:22:01 compute-0 nova_compute[192716]:     </nova:instance>
Oct 07 22:22:01 compute-0 nova_compute[192716]:   </metadata>
Oct 07 22:22:01 compute-0 nova_compute[192716]:   <sysinfo type="smbios">
Oct 07 22:22:01 compute-0 nova_compute[192716]:     <system>
Oct 07 22:22:01 compute-0 nova_compute[192716]:       <entry name="manufacturer">RDO</entry>
Oct 07 22:22:01 compute-0 nova_compute[192716]:       <entry name="product">OpenStack Compute</entry>
Oct 07 22:22:01 compute-0 nova_compute[192716]:       <entry name="version">32.1.0-0.20251007122402.7278e66.el10</entry>
Oct 07 22:22:01 compute-0 nova_compute[192716]:       <entry name="serial">e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6</entry>
Oct 07 22:22:01 compute-0 nova_compute[192716]:       <entry name="uuid">e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6</entry>
Oct 07 22:22:01 compute-0 nova_compute[192716]:       <entry name="family">Virtual Machine</entry>
Oct 07 22:22:01 compute-0 nova_compute[192716]:     </system>
Oct 07 22:22:01 compute-0 nova_compute[192716]:   </sysinfo>
Oct 07 22:22:01 compute-0 nova_compute[192716]:   <os>
Oct 07 22:22:01 compute-0 nova_compute[192716]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 07 22:22:01 compute-0 nova_compute[192716]:     <boot dev="hd"/>
Oct 07 22:22:01 compute-0 nova_compute[192716]:     <smbios mode="sysinfo"/>
Oct 07 22:22:01 compute-0 nova_compute[192716]:   </os>
Oct 07 22:22:01 compute-0 nova_compute[192716]:   <features>
Oct 07 22:22:01 compute-0 nova_compute[192716]:     <acpi/>
Oct 07 22:22:01 compute-0 nova_compute[192716]:     <apic/>
Oct 07 22:22:01 compute-0 nova_compute[192716]:     <vmcoreinfo/>
Oct 07 22:22:01 compute-0 nova_compute[192716]:   </features>
Oct 07 22:22:01 compute-0 nova_compute[192716]:   <clock offset="utc">
Oct 07 22:22:01 compute-0 nova_compute[192716]:     <timer name="pit" tickpolicy="delay"/>
Oct 07 22:22:01 compute-0 nova_compute[192716]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 07 22:22:01 compute-0 nova_compute[192716]:     <timer name="hpet" present="no"/>
Oct 07 22:22:01 compute-0 nova_compute[192716]:   </clock>
Oct 07 22:22:01 compute-0 nova_compute[192716]:   <cpu mode="host-model" match="exact">
Oct 07 22:22:01 compute-0 nova_compute[192716]:     <topology sockets="1" cores="1" threads="1"/>
Oct 07 22:22:01 compute-0 nova_compute[192716]:   </cpu>
Oct 07 22:22:01 compute-0 nova_compute[192716]:   <devices>
Oct 07 22:22:01 compute-0 nova_compute[192716]:     <disk type="file" device="disk">
Oct 07 22:22:01 compute-0 nova_compute[192716]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 07 22:22:01 compute-0 nova_compute[192716]:       <source file="/var/lib/nova/instances/e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6/disk"/>
Oct 07 22:22:01 compute-0 nova_compute[192716]:       <target dev="vda" bus="virtio"/>
Oct 07 22:22:01 compute-0 nova_compute[192716]:     </disk>
Oct 07 22:22:01 compute-0 nova_compute[192716]:     <disk type="file" device="cdrom">
Oct 07 22:22:01 compute-0 nova_compute[192716]:       <driver name="qemu" type="raw" cache="none"/>
Oct 07 22:22:01 compute-0 nova_compute[192716]:       <source file="/var/lib/nova/instances/e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6/disk.config"/>
Oct 07 22:22:01 compute-0 nova_compute[192716]:       <target dev="sda" bus="sata"/>
Oct 07 22:22:01 compute-0 nova_compute[192716]:     </disk>
Oct 07 22:22:01 compute-0 nova_compute[192716]:     <interface type="ethernet">
Oct 07 22:22:01 compute-0 nova_compute[192716]:       <mac address="fa:16:3e:d8:7f:e0"/>
Oct 07 22:22:01 compute-0 nova_compute[192716]:       <model type="virtio"/>
Oct 07 22:22:01 compute-0 nova_compute[192716]:       <driver name="vhost" rx_queue_size="512"/>
Oct 07 22:22:01 compute-0 nova_compute[192716]:       <mtu size="1442"/>
Oct 07 22:22:01 compute-0 nova_compute[192716]:       <target dev="tap2959c2dd-2b"/>
Oct 07 22:22:01 compute-0 nova_compute[192716]:     </interface>
Oct 07 22:22:01 compute-0 nova_compute[192716]:     <serial type="pty">
Oct 07 22:22:01 compute-0 nova_compute[192716]:       <log file="/var/lib/nova/instances/e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6/console.log" append="off"/>
Oct 07 22:22:01 compute-0 nova_compute[192716]:     </serial>
Oct 07 22:22:01 compute-0 nova_compute[192716]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 07 22:22:01 compute-0 nova_compute[192716]:     <video>
Oct 07 22:22:01 compute-0 nova_compute[192716]:       <model type="virtio"/>
Oct 07 22:22:01 compute-0 nova_compute[192716]:     </video>
Oct 07 22:22:01 compute-0 nova_compute[192716]:     <input type="tablet" bus="usb"/>
Oct 07 22:22:01 compute-0 nova_compute[192716]:     <rng model="virtio">
Oct 07 22:22:01 compute-0 nova_compute[192716]:       <backend model="random">/dev/urandom</backend>
Oct 07 22:22:01 compute-0 nova_compute[192716]:     </rng>
Oct 07 22:22:01 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root"/>
Oct 07 22:22:01 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:22:01 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:22:01 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:22:01 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:22:01 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:22:01 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:22:01 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:22:01 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:22:01 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:22:01 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:22:01 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:22:01 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:22:01 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:22:01 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:22:01 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:22:01 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:22:01 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:22:01 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:22:01 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:22:01 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:22:01 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:22:01 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:22:01 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:22:01 compute-0 nova_compute[192716]:     <controller type="pci" model="pcie-root-port"/>
Oct 07 22:22:01 compute-0 nova_compute[192716]:     <controller type="usb" index="0"/>
Oct 07 22:22:01 compute-0 nova_compute[192716]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 07 22:22:01 compute-0 nova_compute[192716]:       <stats period="10"/>
Oct 07 22:22:01 compute-0 nova_compute[192716]:     </memballoon>
Oct 07 22:22:01 compute-0 nova_compute[192716]:   </devices>
Oct 07 22:22:01 compute-0 nova_compute[192716]: </domain>
Oct 07 22:22:01 compute-0 nova_compute[192716]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Oct 07 22:22:01 compute-0 nova_compute[192716]: 2025-10-07 22:22:01.828 2 DEBUG nova.compute.manager [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Preparing to wait for external event network-vif-plugged-2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Oct 07 22:22:01 compute-0 nova_compute[192716]: 2025-10-07 22:22:01.828 2 DEBUG oslo_concurrency.lockutils [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Acquiring lock "e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:22:01 compute-0 nova_compute[192716]: 2025-10-07 22:22:01.828 2 DEBUG oslo_concurrency.lockutils [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Lock "e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:22:01 compute-0 nova_compute[192716]: 2025-10-07 22:22:01.829 2 DEBUG oslo_concurrency.lockutils [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Lock "e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:22:01 compute-0 nova_compute[192716]: 2025-10-07 22:22:01.829 2 DEBUG nova.virt.libvirt.vif [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-07T22:21:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1784705288',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1784705288',id=31,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4afdd5a94b604f91a0cbdb5a281ca0c6',ramdisk_id='',reservation_id='r-pbwh49x2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-14286770',owner_user_name='tempest-Test
ExecuteZoneMigrationStrategy-14286770-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T22:21:52Z,user_data=None,user_id='f4a3102f6d8e4d53980ce5f605b5e7db',uuid=e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c", "address": "fa:16:3e:d8:7f:e0", "network": {"id": "8558ef14-bbda-4677-87c6-5cb9edc8eae5", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1182536919-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f08d1843f22b495a995f11f0b1c90ace", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2959c2dd-2b", "ovs_interfaceid": "2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 07 22:22:01 compute-0 nova_compute[192716]: 2025-10-07 22:22:01.829 2 DEBUG nova.network.os_vif_util [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Converting VIF {"id": "2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c", "address": "fa:16:3e:d8:7f:e0", "network": {"id": "8558ef14-bbda-4677-87c6-5cb9edc8eae5", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1182536919-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f08d1843f22b495a995f11f0b1c90ace", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2959c2dd-2b", "ovs_interfaceid": "2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 22:22:01 compute-0 nova_compute[192716]: 2025-10-07 22:22:01.830 2 DEBUG nova.network.os_vif_util [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d8:7f:e0,bridge_name='br-int',has_traffic_filtering=True,id=2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c,network=Network(8558ef14-bbda-4677-87c6-5cb9edc8eae5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2959c2dd-2b') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 22:22:01 compute-0 nova_compute[192716]: 2025-10-07 22:22:01.830 2 DEBUG os_vif [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d8:7f:e0,bridge_name='br-int',has_traffic_filtering=True,id=2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c,network=Network(8558ef14-bbda-4677-87c6-5cb9edc8eae5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2959c2dd-2b') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 07 22:22:01 compute-0 nova_compute[192716]: 2025-10-07 22:22:01.831 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:01 compute-0 nova_compute[192716]: 2025-10-07 22:22:01.831 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:22:01 compute-0 nova_compute[192716]: 2025-10-07 22:22:01.831 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:22:01 compute-0 nova_compute[192716]: 2025-10-07 22:22:01.832 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:01 compute-0 nova_compute[192716]: 2025-10-07 22:22:01.832 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'bf6d2883-f1ac-55c4-90e6-a9c216f04f00', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:22:01 compute-0 nova_compute[192716]: 2025-10-07 22:22:01.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:01 compute-0 nova_compute[192716]: 2025-10-07 22:22:01.835 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:01 compute-0 nova_compute[192716]: 2025-10-07 22:22:01.839 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:01 compute-0 nova_compute[192716]: 2025-10-07 22:22:01.839 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2959c2dd-2b, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:22:01 compute-0 nova_compute[192716]: 2025-10-07 22:22:01.840 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap2959c2dd-2b, col_values=(('qos', UUID('f68c8be8-16ce-4893-9b27-aa637a9d87b6')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:22:01 compute-0 nova_compute[192716]: 2025-10-07 22:22:01.841 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap2959c2dd-2b, col_values=(('external_ids', {'iface-id': '2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d8:7f:e0', 'vm-uuid': 'e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:22:01 compute-0 nova_compute[192716]: 2025-10-07 22:22:01.842 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:01 compute-0 NetworkManager[51722]: <info>  [1759875721.8439] manager: (tap2959c2dd-2b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/94)
Oct 07 22:22:01 compute-0 nova_compute[192716]: 2025-10-07 22:22:01.846 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 07 22:22:01 compute-0 nova_compute[192716]: 2025-10-07 22:22:01.848 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:01 compute-0 nova_compute[192716]: 2025-10-07 22:22:01.849 2 INFO os_vif [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d8:7f:e0,bridge_name='br-int',has_traffic_filtering=True,id=2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c,network=Network(8558ef14-bbda-4677-87c6-5cb9edc8eae5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2959c2dd-2b')
Oct 07 22:22:01 compute-0 podman[228010]: 2025-10-07 22:22:01.856645891 +0000 UTC m=+0.084813771 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Oct 07 22:22:03 compute-0 nova_compute[192716]: 2025-10-07 22:22:03.395 2 DEBUG nova.virt.libvirt.driver [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 07 22:22:03 compute-0 nova_compute[192716]: 2025-10-07 22:22:03.396 2 DEBUG nova.virt.libvirt.driver [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 07 22:22:03 compute-0 nova_compute[192716]: 2025-10-07 22:22:03.396 2 DEBUG nova.virt.libvirt.driver [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] No VIF found with MAC fa:16:3e:d8:7f:e0, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Oct 07 22:22:03 compute-0 nova_compute[192716]: 2025-10-07 22:22:03.397 2 INFO nova.virt.libvirt.driver [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Using config drive
Oct 07 22:22:03 compute-0 nova_compute[192716]: 2025-10-07 22:22:03.493 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:22:03 compute-0 nova_compute[192716]: 2025-10-07 22:22:03.908 2 WARNING neutronclient.v2_0.client [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:22:04 compute-0 nova_compute[192716]: 2025-10-07 22:22:04.680 2 INFO nova.virt.libvirt.driver [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Creating config drive at /var/lib/nova/instances/e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6/disk.config
Oct 07 22:22:04 compute-0 nova_compute[192716]: 2025-10-07 22:22:04.690 2 DEBUG oslo_concurrency.processutils [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251007122402.7278e66.el10 -quiet -J -r -V config-2 /tmp/tmpmqiatixl execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:22:04 compute-0 nova_compute[192716]: 2025-10-07 22:22:04.821 2 DEBUG oslo_concurrency.processutils [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251007122402.7278e66.el10 -quiet -J -r -V config-2 /tmp/tmpmqiatixl" returned: 0 in 0.131s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:22:04 compute-0 kernel: tap2959c2dd-2b: entered promiscuous mode
Oct 07 22:22:04 compute-0 nova_compute[192716]: 2025-10-07 22:22:04.936 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:04 compute-0 nova_compute[192716]: 2025-10-07 22:22:04.944 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:04 compute-0 nova_compute[192716]: 2025-10-07 22:22:04.946 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:04 compute-0 ovn_controller[94904]: 2025-10-07T22:22:04Z|00264|binding|INFO|Claiming lport 2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c for this chassis.
Oct 07 22:22:04 compute-0 ovn_controller[94904]: 2025-10-07T22:22:04Z|00265|binding|INFO|2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c: Claiming fa:16:3e:d8:7f:e0 10.100.0.11
Oct 07 22:22:04 compute-0 NetworkManager[51722]: <info>  [1759875724.9555] manager: (tap2959c2dd-2b): new Tun device (/org/freedesktop/NetworkManager/Devices/95)
Oct 07 22:22:04 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:04.955 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d8:7f:e0 10.100.0.11'], port_security=['fa:16:3e:d8:7f:e0 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8558ef14-bbda-4677-87c6-5cb9edc8eae5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4afdd5a94b604f91a0cbdb5a281ca0c6', 'neutron:revision_number': '4', 'neutron:security_group_ids': '4eb226e6-94bc-4936-86e1-40d24d079c69', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9e008ff2-8b66-4169-982c-1129d9c90df2, chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], logical_port=2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:22:04 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:04.956 103791 INFO neutron.agent.ovn.metadata.agent [-] Port 2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c in datapath 8558ef14-bbda-4677-87c6-5cb9edc8eae5 bound to our chassis
Oct 07 22:22:04 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:04.958 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8558ef14-bbda-4677-87c6-5cb9edc8eae5
Oct 07 22:22:04 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:04.972 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[a799ba0a-ca77-4304-81a2-c7f89ce997cc]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:22:04 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:04.973 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8558ef14-b1 in ovnmeta-8558ef14-bbda-4677-87c6-5cb9edc8eae5 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Oct 07 22:22:04 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:04.977 214116 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8558ef14-b0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Oct 07 22:22:04 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:04.978 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[a46cb858-6a75-40a6-8b3b-100bfd4e29e6]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:22:04 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:04.979 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[626c20c5-b8fd-4832-af21-34b2a7696510]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:22:05 compute-0 nova_compute[192716]: 2025-10-07 22:22:04.998 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:05 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:05.001 103905 DEBUG oslo.privsep.daemon [-] privsep: reply[0773443a-f6f9-4159-ba2c-3860150743e3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:22:05 compute-0 ovn_controller[94904]: 2025-10-07T22:22:05Z|00266|binding|INFO|Setting lport 2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c ovn-installed in OVS
Oct 07 22:22:05 compute-0 ovn_controller[94904]: 2025-10-07T22:22:05Z|00267|binding|INFO|Setting lport 2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c up in Southbound
Oct 07 22:22:05 compute-0 nova_compute[192716]: 2025-10-07 22:22:05.004 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:05 compute-0 systemd-machined[152719]: New machine qemu-23-instance-0000001f.
Oct 07 22:22:05 compute-0 systemd[1]: Started Virtual Machine qemu-23-instance-0000001f.
Oct 07 22:22:05 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:05.022 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[bfda6c54-e70e-4dc8-a0f8-6528ef10b627]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:22:05 compute-0 podman[228062]: 2025-10-07 22:22:05.034178821 +0000 UTC m=+0.116012539 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 07 22:22:05 compute-0 systemd-udevd[228100]: Network interface NamePolicy= disabled on kernel command line.
Oct 07 22:22:05 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:05.061 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[cf60cd6c-c4ac-47de-8a9c-d24fa4f70df2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:22:05 compute-0 NetworkManager[51722]: <info>  [1759875725.0669] device (tap2959c2dd-2b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 07 22:22:05 compute-0 NetworkManager[51722]: <info>  [1759875725.0679] device (tap2959c2dd-2b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 07 22:22:05 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:05.070 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[fe7172a4-c7d1-4847-ac1b-867b1108ca87]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:22:05 compute-0 NetworkManager[51722]: <info>  [1759875725.0712] manager: (tap8558ef14-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/96)
Oct 07 22:22:05 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:05.118 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[de870a91-6d7d-44f7-926b-b2c05b1080ac]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:22:05 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:05.121 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[0a201bd2-e0d3-43d8-b4b8-f2073268fe82]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:22:05 compute-0 NetworkManager[51722]: <info>  [1759875725.1503] device (tap8558ef14-b0): carrier: link connected
Oct 07 22:22:05 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:05.158 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[b0003233-fa63-4a23-aab1-97868be5d0dd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:22:05 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:05.179 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[f3c60e55-3e57-4234-912a-d44ba3cbe266]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8558ef14-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:61:0b:74'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 70], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 545873, 'reachable_time': 35614, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228126, 'error': None, 'target': 'ovnmeta-8558ef14-bbda-4677-87c6-5cb9edc8eae5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:22:05 compute-0 nova_compute[192716]: 2025-10-07 22:22:05.186 2 DEBUG nova.compute.manager [req-1e7752e0-fd69-43f6-bf4d-e49f73b2c232 req-4ce5749e-a3b3-4adf-b8ca-1ed796c61143 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Received event network-vif-plugged-2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:22:05 compute-0 nova_compute[192716]: 2025-10-07 22:22:05.186 2 DEBUG oslo_concurrency.lockutils [req-1e7752e0-fd69-43f6-bf4d-e49f73b2c232 req-4ce5749e-a3b3-4adf-b8ca-1ed796c61143 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:22:05 compute-0 nova_compute[192716]: 2025-10-07 22:22:05.187 2 DEBUG oslo_concurrency.lockutils [req-1e7752e0-fd69-43f6-bf4d-e49f73b2c232 req-4ce5749e-a3b3-4adf-b8ca-1ed796c61143 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:22:05 compute-0 nova_compute[192716]: 2025-10-07 22:22:05.187 2 DEBUG oslo_concurrency.lockutils [req-1e7752e0-fd69-43f6-bf4d-e49f73b2c232 req-4ce5749e-a3b3-4adf-b8ca-1ed796c61143 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:22:05 compute-0 nova_compute[192716]: 2025-10-07 22:22:05.187 2 DEBUG nova.compute.manager [req-1e7752e0-fd69-43f6-bf4d-e49f73b2c232 req-4ce5749e-a3b3-4adf-b8ca-1ed796c61143 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Processing event network-vif-plugged-2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Oct 07 22:22:05 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:05.200 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[03c95f89-3270-41f4-a5ea-0ae0659a53c5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe61:b74'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 545873, 'tstamp': 545873}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228127, 'error': None, 'target': 'ovnmeta-8558ef14-bbda-4677-87c6-5cb9edc8eae5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:22:05 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:05.224 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[3ac17d94-b28d-4375-81f2-f78a767fb4ca]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8558ef14-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:61:0b:74'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 70], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 545873, 'reachable_time': 35614, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 228128, 'error': None, 'target': 'ovnmeta-8558ef14-bbda-4677-87c6-5cb9edc8eae5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:22:05 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:05.268 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[0bd98605-ec25-4746-a3df-7012a113a06f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:22:05 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:05.350 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[94c6f0fd-8c09-4082-bbe5-414c6346e421]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:22:05 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:05.352 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8558ef14-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:22:05 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:05.352 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:22:05 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:05.352 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8558ef14-b0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:22:05 compute-0 nova_compute[192716]: 2025-10-07 22:22:05.354 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:05 compute-0 NetworkManager[51722]: <info>  [1759875725.3552] manager: (tap8558ef14-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/97)
Oct 07 22:22:05 compute-0 kernel: tap8558ef14-b0: entered promiscuous mode
Oct 07 22:22:05 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:05.357 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8558ef14-b0, col_values=(('external_ids', {'iface-id': 'ffdd001d-1d41-425c-92ed-0208c7cec9cd'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:22:05 compute-0 ovn_controller[94904]: 2025-10-07T22:22:05Z|00268|binding|INFO|Releasing lport ffdd001d-1d41-425c-92ed-0208c7cec9cd from this chassis (sb_readonly=0)
Oct 07 22:22:05 compute-0 nova_compute[192716]: 2025-10-07 22:22:05.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:05 compute-0 nova_compute[192716]: 2025-10-07 22:22:05.360 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:05 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:05.361 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[7eef5b00-8a74-41ff-b19d-dd367e4b3055]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:22:05 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:05.363 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8558ef14-bbda-4677-87c6-5cb9edc8eae5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8558ef14-bbda-4677-87c6-5cb9edc8eae5.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 22:22:05 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:05.363 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8558ef14-bbda-4677-87c6-5cb9edc8eae5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8558ef14-bbda-4677-87c6-5cb9edc8eae5.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 22:22:05 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:05.363 103791 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 8558ef14-bbda-4677-87c6-5cb9edc8eae5 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Oct 07 22:22:05 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:05.363 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8558ef14-bbda-4677-87c6-5cb9edc8eae5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8558ef14-bbda-4677-87c6-5cb9edc8eae5.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 22:22:05 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:05.364 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[13899955-39f1-4ca9-aadc-ed38c7d012ed]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:22:05 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:05.364 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8558ef14-bbda-4677-87c6-5cb9edc8eae5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8558ef14-bbda-4677-87c6-5cb9edc8eae5.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 22:22:05 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:05.365 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[09ee2b44-64ef-448b-b1f7-e4c201021953]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:22:05 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:05.365 103791 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Oct 07 22:22:05 compute-0 ovn_metadata_agent[103786]: global
Oct 07 22:22:05 compute-0 ovn_metadata_agent[103786]:     log         /dev/log local0 debug
Oct 07 22:22:05 compute-0 ovn_metadata_agent[103786]:     log-tag     haproxy-metadata-proxy-8558ef14-bbda-4677-87c6-5cb9edc8eae5
Oct 07 22:22:05 compute-0 ovn_metadata_agent[103786]:     user        root
Oct 07 22:22:05 compute-0 ovn_metadata_agent[103786]:     group       root
Oct 07 22:22:05 compute-0 ovn_metadata_agent[103786]:     maxconn     1024
Oct 07 22:22:05 compute-0 ovn_metadata_agent[103786]:     pidfile     /var/lib/neutron/external/pids/8558ef14-bbda-4677-87c6-5cb9edc8eae5.pid.haproxy
Oct 07 22:22:05 compute-0 ovn_metadata_agent[103786]:     daemon
Oct 07 22:22:05 compute-0 ovn_metadata_agent[103786]: 
Oct 07 22:22:05 compute-0 ovn_metadata_agent[103786]: defaults
Oct 07 22:22:05 compute-0 ovn_metadata_agent[103786]:     log global
Oct 07 22:22:05 compute-0 ovn_metadata_agent[103786]:     mode http
Oct 07 22:22:05 compute-0 ovn_metadata_agent[103786]:     option httplog
Oct 07 22:22:05 compute-0 ovn_metadata_agent[103786]:     option dontlognull
Oct 07 22:22:05 compute-0 ovn_metadata_agent[103786]:     option http-server-close
Oct 07 22:22:05 compute-0 ovn_metadata_agent[103786]:     option forwardfor
Oct 07 22:22:05 compute-0 ovn_metadata_agent[103786]:     retries                 3
Oct 07 22:22:05 compute-0 ovn_metadata_agent[103786]:     timeout http-request    30s
Oct 07 22:22:05 compute-0 ovn_metadata_agent[103786]:     timeout connect         30s
Oct 07 22:22:05 compute-0 ovn_metadata_agent[103786]:     timeout client          32s
Oct 07 22:22:05 compute-0 ovn_metadata_agent[103786]:     timeout server          32s
Oct 07 22:22:05 compute-0 ovn_metadata_agent[103786]:     timeout http-keep-alive 30s
Oct 07 22:22:05 compute-0 ovn_metadata_agent[103786]: 
Oct 07 22:22:05 compute-0 ovn_metadata_agent[103786]: listen listener
Oct 07 22:22:05 compute-0 ovn_metadata_agent[103786]:     bind 169.254.169.254:80
Oct 07 22:22:05 compute-0 ovn_metadata_agent[103786]:     
Oct 07 22:22:05 compute-0 ovn_metadata_agent[103786]:     server metadata /var/lib/neutron/metadata_proxy
Oct 07 22:22:05 compute-0 ovn_metadata_agent[103786]: 
Oct 07 22:22:05 compute-0 ovn_metadata_agent[103786]:     http-request add-header X-OVN-Network-ID 8558ef14-bbda-4677-87c6-5cb9edc8eae5
Oct 07 22:22:05 compute-0 ovn_metadata_agent[103786]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Oct 07 22:22:05 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:05.367 103791 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8558ef14-bbda-4677-87c6-5cb9edc8eae5', 'env', 'PROCESS_TAG=haproxy-8558ef14-bbda-4677-87c6-5cb9edc8eae5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8558ef14-bbda-4677-87c6-5cb9edc8eae5.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Oct 07 22:22:05 compute-0 nova_compute[192716]: 2025-10-07 22:22:05.390 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:05 compute-0 podman[228167]: 2025-10-07 22:22:05.807360064 +0000 UTC m=+0.074896446 container create e021446a9d152fef8aa75cd89b1f7fd81082b521522a152a9fd5f5fe3199b988 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-8558ef14-bbda-4677-87c6-5cb9edc8eae5, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Oct 07 22:22:05 compute-0 podman[228167]: 2025-10-07 22:22:05.763339997 +0000 UTC m=+0.030876439 image pull 24d4277b41bbd1d97b6f360ea068040fe96182680512bacad34d1f578f4798a9 38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Oct 07 22:22:05 compute-0 systemd[1]: Started libpod-conmon-e021446a9d152fef8aa75cd89b1f7fd81082b521522a152a9fd5f5fe3199b988.scope.
Oct 07 22:22:05 compute-0 systemd[1]: Started libcrun container.
Oct 07 22:22:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9e99da351bc16b134d315c3fd07d21ff5672ba3fe082522e97ca2ee97d93fedb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 07 22:22:05 compute-0 podman[228167]: 2025-10-07 22:22:05.935010146 +0000 UTC m=+0.202546528 container init e021446a9d152fef8aa75cd89b1f7fd81082b521522a152a9fd5f5fe3199b988 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-8558ef14-bbda-4677-87c6-5cb9edc8eae5, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS)
Oct 07 22:22:05 compute-0 podman[228167]: 2025-10-07 22:22:05.941134902 +0000 UTC m=+0.208671284 container start e021446a9d152fef8aa75cd89b1f7fd81082b521522a152a9fd5f5fe3199b988 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-8558ef14-bbda-4677-87c6-5cb9edc8eae5, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Oct 07 22:22:05 compute-0 neutron-haproxy-ovnmeta-8558ef14-bbda-4677-87c6-5cb9edc8eae5[228183]: [NOTICE]   (228187) : New worker (228189) forked
Oct 07 22:22:05 compute-0 neutron-haproxy-ovnmeta-8558ef14-bbda-4677-87c6-5cb9edc8eae5[228183]: [NOTICE]   (228187) : Loading success.
Oct 07 22:22:05 compute-0 nova_compute[192716]: 2025-10-07 22:22:05.982 2 DEBUG nova.compute.manager [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Oct 07 22:22:05 compute-0 nova_compute[192716]: 2025-10-07 22:22:05.987 2 DEBUG nova.virt.libvirt.driver [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Oct 07 22:22:05 compute-0 nova_compute[192716]: 2025-10-07 22:22:05.994 2 INFO nova.virt.libvirt.driver [-] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Instance spawned successfully.
Oct 07 22:22:05 compute-0 nova_compute[192716]: 2025-10-07 22:22:05.994 2 DEBUG nova.virt.libvirt.driver [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Oct 07 22:22:06 compute-0 nova_compute[192716]: 2025-10-07 22:22:06.179 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:06 compute-0 nova_compute[192716]: 2025-10-07 22:22:06.507 2 DEBUG nova.virt.libvirt.driver [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 22:22:06 compute-0 nova_compute[192716]: 2025-10-07 22:22:06.508 2 DEBUG nova.virt.libvirt.driver [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 22:22:06 compute-0 nova_compute[192716]: 2025-10-07 22:22:06.508 2 DEBUG nova.virt.libvirt.driver [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 22:22:06 compute-0 nova_compute[192716]: 2025-10-07 22:22:06.509 2 DEBUG nova.virt.libvirt.driver [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 22:22:06 compute-0 nova_compute[192716]: 2025-10-07 22:22:06.509 2 DEBUG nova.virt.libvirt.driver [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 22:22:06 compute-0 nova_compute[192716]: 2025-10-07 22:22:06.509 2 DEBUG nova.virt.libvirt.driver [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 07 22:22:06 compute-0 nova_compute[192716]: 2025-10-07 22:22:06.844 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:07 compute-0 nova_compute[192716]: 2025-10-07 22:22:07.021 2 INFO nova.compute.manager [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Took 13.49 seconds to spawn the instance on the hypervisor.
Oct 07 22:22:07 compute-0 nova_compute[192716]: 2025-10-07 22:22:07.022 2 DEBUG nova.compute.manager [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 07 22:22:07 compute-0 nova_compute[192716]: 2025-10-07 22:22:07.261 2 DEBUG nova.compute.manager [req-9db19343-b5a9-4661-aa95-dc6bb5ba2497 req-5abdd26d-8a6e-4347-80fc-182668b79792 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Received event network-vif-plugged-2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:22:07 compute-0 nova_compute[192716]: 2025-10-07 22:22:07.262 2 DEBUG oslo_concurrency.lockutils [req-9db19343-b5a9-4661-aa95-dc6bb5ba2497 req-5abdd26d-8a6e-4347-80fc-182668b79792 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:22:07 compute-0 nova_compute[192716]: 2025-10-07 22:22:07.262 2 DEBUG oslo_concurrency.lockutils [req-9db19343-b5a9-4661-aa95-dc6bb5ba2497 req-5abdd26d-8a6e-4347-80fc-182668b79792 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:22:07 compute-0 nova_compute[192716]: 2025-10-07 22:22:07.263 2 DEBUG oslo_concurrency.lockutils [req-9db19343-b5a9-4661-aa95-dc6bb5ba2497 req-5abdd26d-8a6e-4347-80fc-182668b79792 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:22:07 compute-0 nova_compute[192716]: 2025-10-07 22:22:07.263 2 DEBUG nova.compute.manager [req-9db19343-b5a9-4661-aa95-dc6bb5ba2497 req-5abdd26d-8a6e-4347-80fc-182668b79792 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] No waiting events found dispatching network-vif-plugged-2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 22:22:07 compute-0 nova_compute[192716]: 2025-10-07 22:22:07.264 2 WARNING nova.compute.manager [req-9db19343-b5a9-4661-aa95-dc6bb5ba2497 req-5abdd26d-8a6e-4347-80fc-182668b79792 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Received unexpected event network-vif-plugged-2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c for instance with vm_state active and task_state None.
Oct 07 22:22:07 compute-0 nova_compute[192716]: 2025-10-07 22:22:07.556 2 INFO nova.compute.manager [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Took 18.83 seconds to build instance.
Oct 07 22:22:08 compute-0 nova_compute[192716]: 2025-10-07 22:22:08.062 2 DEBUG oslo_concurrency.lockutils [None req-c071394c-765b-4ce6-a1d2-12872dbda873 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Lock "e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 20.356s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:22:11 compute-0 nova_compute[192716]: 2025-10-07 22:22:11.180 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:11 compute-0 nova_compute[192716]: 2025-10-07 22:22:11.883 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:13 compute-0 podman[228199]: 2025-10-07 22:22:13.823157742 +0000 UTC m=+0.057032542 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct 07 22:22:13 compute-0 podman[228198]: 2025-10-07 22:22:13.866292043 +0000 UTC m=+0.105062374 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Oct 07 22:22:16 compute-0 nova_compute[192716]: 2025-10-07 22:22:16.182 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:16 compute-0 nova_compute[192716]: 2025-10-07 22:22:16.886 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:16 compute-0 nova_compute[192716]: 2025-10-07 22:22:16.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:22:16 compute-0 nova_compute[192716]: 2025-10-07 22:22:16.991 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Oct 07 22:22:17 compute-0 ovn_controller[94904]: 2025-10-07T22:22:17Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d8:7f:e0 10.100.0.11
Oct 07 22:22:17 compute-0 ovn_controller[94904]: 2025-10-07T22:22:17Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d8:7f:e0 10.100.0.11
Oct 07 22:22:19 compute-0 podman[228260]: 2025-10-07 22:22:19.8402539 +0000 UTC m=+0.075177494 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.33.7, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.expose-services=, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal)
Oct 07 22:22:20 compute-0 nova_compute[192716]: 2025-10-07 22:22:20.767 2 DEBUG nova.virt.libvirt.driver [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59] Creating tmpfile /var/lib/nova/instances/tmpvxhrihd5 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Oct 07 22:22:20 compute-0 nova_compute[192716]: 2025-10-07 22:22:20.768 2 WARNING neutronclient.v2_0.client [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:22:20 compute-0 nova_compute[192716]: 2025-10-07 22:22:20.782 2 DEBUG nova.compute.manager [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpvxhrihd5',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Oct 07 22:22:21 compute-0 nova_compute[192716]: 2025-10-07 22:22:21.184 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:21 compute-0 nova_compute[192716]: 2025-10-07 22:22:21.889 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:22 compute-0 nova_compute[192716]: 2025-10-07 22:22:22.836 2 WARNING neutronclient.v2_0.client [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:22:23 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Oct 07 22:22:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:25.665 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:22:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:25.665 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:22:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:25.666 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:22:26 compute-0 nova_compute[192716]: 2025-10-07 22:22:26.188 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:26 compute-0 nova_compute[192716]: 2025-10-07 22:22:26.833 2 DEBUG nova.compute.manager [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpvxhrihd5',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Oct 07 22:22:26 compute-0 nova_compute[192716]: 2025-10-07 22:22:26.891 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:27 compute-0 nova_compute[192716]: 2025-10-07 22:22:27.855 2 DEBUG oslo_concurrency.lockutils [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "refresh_cache-b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 07 22:22:27 compute-0 nova_compute[192716]: 2025-10-07 22:22:27.856 2 DEBUG oslo_concurrency.lockutils [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquired lock "refresh_cache-b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 07 22:22:27 compute-0 nova_compute[192716]: 2025-10-07 22:22:27.857 2 DEBUG nova.network.neutron [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 07 22:22:28 compute-0 nova_compute[192716]: 2025-10-07 22:22:28.365 2 WARNING neutronclient.v2_0.client [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:22:29 compute-0 nova_compute[192716]: 2025-10-07 22:22:29.003 2 WARNING neutronclient.v2_0.client [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:22:29 compute-0 nova_compute[192716]: 2025-10-07 22:22:29.238 2 DEBUG nova.network.neutron [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59] Updating instance_info_cache with network_info: [{"id": "ef354981-8b70-4bdd-ae10-8e0958d99587", "address": "fa:16:3e:25:29:e3", "network": {"id": "8558ef14-bbda-4677-87c6-5cb9edc8eae5", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1182536919-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f08d1843f22b495a995f11f0b1c90ace", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef354981-8b", "ovs_interfaceid": "ef354981-8b70-4bdd-ae10-8e0958d99587", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:22:29 compute-0 nova_compute[192716]: 2025-10-07 22:22:29.746 2 DEBUG oslo_concurrency.lockutils [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Releasing lock "refresh_cache-b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 07 22:22:29 compute-0 podman[203153]: time="2025-10-07T22:22:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:22:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:22:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20751 "" "Go-http-client/1.1"
Oct 07 22:22:29 compute-0 nova_compute[192716]: 2025-10-07 22:22:29.765 2 DEBUG nova.virt.libvirt.driver [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpvxhrihd5',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Oct 07 22:22:29 compute-0 nova_compute[192716]: 2025-10-07 22:22:29.766 2 DEBUG nova.virt.libvirt.driver [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59] Creating instance directory: /var/lib/nova/instances/b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Oct 07 22:22:29 compute-0 nova_compute[192716]: 2025-10-07 22:22:29.766 2 DEBUG nova.virt.libvirt.driver [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59] Creating disk.info with the contents: {'/var/lib/nova/instances/b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59/disk': 'qcow2', '/var/lib/nova/instances/b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Oct 07 22:22:29 compute-0 nova_compute[192716]: 2025-10-07 22:22:29.767 2 DEBUG nova.virt.libvirt.driver [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Oct 07 22:22:29 compute-0 nova_compute[192716]: 2025-10-07 22:22:29.768 2 DEBUG nova.objects.instance [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lazy-loading 'trusted_certs' on Instance uuid b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 22:22:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:22:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3492 "" "Go-http-client/1.1"
Oct 07 22:22:30 compute-0 nova_compute[192716]: 2025-10-07 22:22:30.276 2 DEBUG oslo_utils.imageutils.format_inspector [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:22:30 compute-0 nova_compute[192716]: 2025-10-07 22:22:30.283 2 DEBUG oslo_utils.imageutils.format_inspector [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:22:30 compute-0 nova_compute[192716]: 2025-10-07 22:22:30.286 2 DEBUG oslo_concurrency.processutils [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:22:30 compute-0 nova_compute[192716]: 2025-10-07 22:22:30.346 2 DEBUG oslo_concurrency.processutils [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:22:30 compute-0 nova_compute[192716]: 2025-10-07 22:22:30.347 2 DEBUG oslo_concurrency.lockutils [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:22:30 compute-0 nova_compute[192716]: 2025-10-07 22:22:30.348 2 DEBUG oslo_concurrency.lockutils [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:22:30 compute-0 nova_compute[192716]: 2025-10-07 22:22:30.349 2 DEBUG oslo_utils.imageutils.format_inspector [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:22:30 compute-0 nova_compute[192716]: 2025-10-07 22:22:30.353 2 DEBUG oslo_utils.imageutils.format_inspector [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 07 22:22:30 compute-0 nova_compute[192716]: 2025-10-07 22:22:30.353 2 DEBUG oslo_concurrency.processutils [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:22:30 compute-0 nova_compute[192716]: 2025-10-07 22:22:30.409 2 DEBUG oslo_concurrency.processutils [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:22:30 compute-0 nova_compute[192716]: 2025-10-07 22:22:30.410 2 DEBUG oslo_concurrency.processutils [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71,backing_fmt=raw /var/lib/nova/instances/b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:22:30 compute-0 nova_compute[192716]: 2025-10-07 22:22:30.454 2 DEBUG oslo_concurrency.processutils [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71,backing_fmt=raw /var/lib/nova/instances/b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:22:30 compute-0 nova_compute[192716]: 2025-10-07 22:22:30.455 2 DEBUG oslo_concurrency.lockutils [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "d6e39bcacac0723e2db68f8e723b02146a6dcb71" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:22:30 compute-0 nova_compute[192716]: 2025-10-07 22:22:30.456 2 DEBUG oslo_concurrency.processutils [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:22:30 compute-0 nova_compute[192716]: 2025-10-07 22:22:30.544 2 DEBUG oslo_concurrency.processutils [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d6e39bcacac0723e2db68f8e723b02146a6dcb71 --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:22:30 compute-0 nova_compute[192716]: 2025-10-07 22:22:30.547 2 DEBUG nova.virt.disk.api [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Checking if we can resize image /var/lib/nova/instances/b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 07 22:22:30 compute-0 nova_compute[192716]: 2025-10-07 22:22:30.548 2 DEBUG oslo_concurrency.processutils [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:22:30 compute-0 nova_compute[192716]: 2025-10-07 22:22:30.608 2 DEBUG oslo_concurrency.processutils [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:22:30 compute-0 nova_compute[192716]: 2025-10-07 22:22:30.609 2 DEBUG nova.virt.disk.api [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Cannot resize image /var/lib/nova/instances/b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 07 22:22:30 compute-0 nova_compute[192716]: 2025-10-07 22:22:30.610 2 DEBUG nova.objects.instance [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lazy-loading 'migration_context' on Instance uuid b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 22:22:31 compute-0 nova_compute[192716]: 2025-10-07 22:22:31.119 2 DEBUG nova.objects.base [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Object Instance<b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Oct 07 22:22:31 compute-0 nova_compute[192716]: 2025-10-07 22:22:31.120 2 DEBUG oslo_concurrency.processutils [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:22:31 compute-0 nova_compute[192716]: 2025-10-07 22:22:31.151 2 DEBUG oslo_concurrency.processutils [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59/disk.config 497664" returned: 0 in 0.031s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:22:31 compute-0 nova_compute[192716]: 2025-10-07 22:22:31.153 2 DEBUG nova.virt.libvirt.driver [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Oct 07 22:22:31 compute-0 nova_compute[192716]: 2025-10-07 22:22:31.155 2 DEBUG nova.virt.libvirt.vif [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-10-07T22:21:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1708960416',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1708960416',id=30,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-07T22:21:43Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='4afdd5a94b604f91a0cbdb5a281ca0c6',ramdisk_id='',reservation_id='r-u0lf1aid',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='v
irtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-14286770',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-14286770-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T22:21:43Z,user_data=None,user_id='f4a3102f6d8e4d53980ce5f605b5e7db',uuid=b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ef354981-8b70-4bdd-ae10-8e0958d99587", "address": "fa:16:3e:25:29:e3", "network": {"id": "8558ef14-bbda-4677-87c6-5cb9edc8eae5", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1182536919-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f08d1843f22b495a995f11f0b1c90ace", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapef354981-8b", "ovs_interfaceid": "ef354981-8b70-4bdd-ae10-8e0958d99587", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 07 22:22:31 compute-0 nova_compute[192716]: 2025-10-07 22:22:31.156 2 DEBUG nova.network.os_vif_util [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Converting VIF {"id": "ef354981-8b70-4bdd-ae10-8e0958d99587", "address": "fa:16:3e:25:29:e3", "network": {"id": "8558ef14-bbda-4677-87c6-5cb9edc8eae5", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1182536919-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f08d1843f22b495a995f11f0b1c90ace", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapef354981-8b", "ovs_interfaceid": "ef354981-8b70-4bdd-ae10-8e0958d99587", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 22:22:31 compute-0 nova_compute[192716]: 2025-10-07 22:22:31.157 2 DEBUG nova.network.os_vif_util [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:25:29:e3,bridge_name='br-int',has_traffic_filtering=True,id=ef354981-8b70-4bdd-ae10-8e0958d99587,network=Network(8558ef14-bbda-4677-87c6-5cb9edc8eae5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef354981-8b') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 22:22:31 compute-0 nova_compute[192716]: 2025-10-07 22:22:31.158 2 DEBUG os_vif [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:25:29:e3,bridge_name='br-int',has_traffic_filtering=True,id=ef354981-8b70-4bdd-ae10-8e0958d99587,network=Network(8558ef14-bbda-4677-87c6-5cb9edc8eae5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef354981-8b') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 07 22:22:31 compute-0 nova_compute[192716]: 2025-10-07 22:22:31.160 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:31 compute-0 nova_compute[192716]: 2025-10-07 22:22:31.161 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:22:31 compute-0 nova_compute[192716]: 2025-10-07 22:22:31.162 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:22:31 compute-0 nova_compute[192716]: 2025-10-07 22:22:31.163 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:31 compute-0 nova_compute[192716]: 2025-10-07 22:22:31.164 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'ddcc074a-f9a9-54c6-9ed1-567b57b1f5ce', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:22:31 compute-0 nova_compute[192716]: 2025-10-07 22:22:31.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:31 compute-0 nova_compute[192716]: 2025-10-07 22:22:31.168 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 07 22:22:31 compute-0 nova_compute[192716]: 2025-10-07 22:22:31.172 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:31 compute-0 nova_compute[192716]: 2025-10-07 22:22:31.172 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapef354981-8b, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:22:31 compute-0 nova_compute[192716]: 2025-10-07 22:22:31.173 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapef354981-8b, col_values=(('qos', UUID('5350d1a9-0fbe-4bf4-90d9-b76fc507381a')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:22:31 compute-0 nova_compute[192716]: 2025-10-07 22:22:31.173 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapef354981-8b, col_values=(('external_ids', {'iface-id': 'ef354981-8b70-4bdd-ae10-8e0958d99587', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:25:29:e3', 'vm-uuid': 'b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:22:31 compute-0 nova_compute[192716]: 2025-10-07 22:22:31.174 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:31 compute-0 NetworkManager[51722]: <info>  [1759875751.1754] manager: (tapef354981-8b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/98)
Oct 07 22:22:31 compute-0 nova_compute[192716]: 2025-10-07 22:22:31.176 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 07 22:22:31 compute-0 nova_compute[192716]: 2025-10-07 22:22:31.185 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:31 compute-0 nova_compute[192716]: 2025-10-07 22:22:31.186 2 INFO os_vif [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:25:29:e3,bridge_name='br-int',has_traffic_filtering=True,id=ef354981-8b70-4bdd-ae10-8e0958d99587,network=Network(8558ef14-bbda-4677-87c6-5cb9edc8eae5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef354981-8b')
Oct 07 22:22:31 compute-0 nova_compute[192716]: 2025-10-07 22:22:31.187 2 DEBUG nova.virt.libvirt.driver [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Oct 07 22:22:31 compute-0 nova_compute[192716]: 2025-10-07 22:22:31.187 2 DEBUG nova.compute.manager [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpvxhrihd5',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Oct 07 22:22:31 compute-0 nova_compute[192716]: 2025-10-07 22:22:31.188 2 WARNING neutronclient.v2_0.client [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:22:31 compute-0 nova_compute[192716]: 2025-10-07 22:22:31.191 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:31 compute-0 openstack_network_exporter[205305]: ERROR   22:22:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:22:31 compute-0 openstack_network_exporter[205305]: ERROR   22:22:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:22:31 compute-0 openstack_network_exporter[205305]: ERROR   22:22:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:22:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:22:31 compute-0 openstack_network_exporter[205305]: ERROR   22:22:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:22:31 compute-0 openstack_network_exporter[205305]: ERROR   22:22:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:22:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:22:31 compute-0 nova_compute[192716]: 2025-10-07 22:22:31.615 2 WARNING neutronclient.v2_0.client [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:22:32 compute-0 nova_compute[192716]: 2025-10-07 22:22:32.662 2 DEBUG nova.network.neutron [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59] Port ef354981-8b70-4bdd-ae10-8e0958d99587 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Oct 07 22:22:32 compute-0 nova_compute[192716]: 2025-10-07 22:22:32.674 2 DEBUG nova.compute.manager [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpvxhrihd5',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Oct 07 22:22:32 compute-0 podman[228305]: 2025-10-07 22:22:32.825752126 +0000 UTC m=+0.061988095 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, container_name=iscsid)
Oct 07 22:22:32 compute-0 podman[228306]: 2025-10-07 22:22:32.834422595 +0000 UTC m=+0.067314007 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007)
Oct 07 22:22:35 compute-0 ovn_controller[94904]: 2025-10-07T22:22:35Z|00269|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Oct 07 22:22:35 compute-0 systemd[1]: Starting libvirt proxy daemon...
Oct 07 22:22:35 compute-0 systemd[1]: Started libvirt proxy daemon.
Oct 07 22:22:35 compute-0 podman[228344]: 2025-10-07 22:22:35.739405335 +0000 UTC m=+0.069233182 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 07 22:22:35 compute-0 NetworkManager[51722]: <info>  [1759875755.8935] manager: (tapef354981-8b): new Tun device (/org/freedesktop/NetworkManager/Devices/99)
Oct 07 22:22:35 compute-0 kernel: tapef354981-8b: entered promiscuous mode
Oct 07 22:22:35 compute-0 ovn_controller[94904]: 2025-10-07T22:22:35Z|00270|binding|INFO|Claiming lport ef354981-8b70-4bdd-ae10-8e0958d99587 for this additional chassis.
Oct 07 22:22:35 compute-0 ovn_controller[94904]: 2025-10-07T22:22:35Z|00271|binding|INFO|ef354981-8b70-4bdd-ae10-8e0958d99587: Claiming fa:16:3e:25:29:e3 10.100.0.13
Oct 07 22:22:35 compute-0 nova_compute[192716]: 2025-10-07 22:22:35.944 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:35 compute-0 systemd-udevd[228397]: Network interface NamePolicy= disabled on kernel command line.
Oct 07 22:22:35 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:35.956 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:25:29:e3 10.100.0.13'], port_security=['fa:16:3e:25:29:e3 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8558ef14-bbda-4677-87c6-5cb9edc8eae5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4afdd5a94b604f91a0cbdb5a281ca0c6', 'neutron:revision_number': '10', 'neutron:security_group_ids': '4eb226e6-94bc-4936-86e1-40d24d079c69', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9e008ff2-8b66-4169-982c-1129d9c90df2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=ef354981-8b70-4bdd-ae10-8e0958d99587) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:22:35 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:35.956 103791 INFO neutron.agent.ovn.metadata.agent [-] Port ef354981-8b70-4bdd-ae10-8e0958d99587 in datapath 8558ef14-bbda-4677-87c6-5cb9edc8eae5 unbound from our chassis
Oct 07 22:22:35 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:35.957 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8558ef14-bbda-4677-87c6-5cb9edc8eae5
Oct 07 22:22:35 compute-0 nova_compute[192716]: 2025-10-07 22:22:35.958 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:35 compute-0 ovn_controller[94904]: 2025-10-07T22:22:35Z|00272|binding|INFO|Setting lport ef354981-8b70-4bdd-ae10-8e0958d99587 ovn-installed in OVS
Oct 07 22:22:35 compute-0 nova_compute[192716]: 2025-10-07 22:22:35.960 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:35 compute-0 nova_compute[192716]: 2025-10-07 22:22:35.961 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:35 compute-0 NetworkManager[51722]: <info>  [1759875755.9712] device (tapef354981-8b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 07 22:22:35 compute-0 NetworkManager[51722]: <info>  [1759875755.9727] device (tapef354981-8b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 07 22:22:35 compute-0 systemd-machined[152719]: New machine qemu-24-instance-0000001e.
Oct 07 22:22:35 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:35.978 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[b374ffe4-5763-48c7-bba8-65ac09d97b91]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:22:35 compute-0 systemd[1]: Started Virtual Machine qemu-24-instance-0000001e.
Oct 07 22:22:36 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:36.017 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[9cd02de9-f8bb-415b-83ce-2e73a63aa63a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:22:36 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:36.020 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[b1a075c9-1e98-4d64-9a30-22b05f5ab010]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:22:36 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:36.059 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[7fb6a73e-d9ea-457f-b5e6-4df813349b54]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:22:36 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:36.082 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[146a3855-a025-459f-a4a6-7223a9c7d22f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8558ef14-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:61:0b:74'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 6, 'rx_bytes': 916, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 6, 'rx_bytes': 916, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 70], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 545873, 'reachable_time': 35614, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228413, 'error': None, 'target': 'ovnmeta-8558ef14-bbda-4677-87c6-5cb9edc8eae5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:22:36 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:36.103 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[a51270f0-8b08-412c-9585-03f775a30095]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap8558ef14-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 545888, 'tstamp': 545888}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228415, 'error': None, 'target': 'ovnmeta-8558ef14-bbda-4677-87c6-5cb9edc8eae5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap8558ef14-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 545892, 'tstamp': 545892}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228415, 'error': None, 'target': 'ovnmeta-8558ef14-bbda-4677-87c6-5cb9edc8eae5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:22:36 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:36.105 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8558ef14-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:22:36 compute-0 nova_compute[192716]: 2025-10-07 22:22:36.107 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:36 compute-0 nova_compute[192716]: 2025-10-07 22:22:36.108 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:36 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:36.108 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8558ef14-b0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:22:36 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:36.109 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:22:36 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:36.110 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8558ef14-b0, col_values=(('external_ids', {'iface-id': 'ffdd001d-1d41-425c-92ed-0208c7cec9cd'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:22:36 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:36.110 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:22:36 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:36.112 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[2b02d3f3-4591-49d8-9c7f-a53dced0a850]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-8558ef14-bbda-4677-87c6-5cb9edc8eae5\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/8558ef14-bbda-4677-87c6-5cb9edc8eae5.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 8558ef14-bbda-4677-87c6-5cb9edc8eae5\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:22:36 compute-0 nova_compute[192716]: 2025-10-07 22:22:36.175 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:36 compute-0 nova_compute[192716]: 2025-10-07 22:22:36.190 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:38.945 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=33, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:e1:cc', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '62:84:ba:a4:1b:31'}, ipsec=False) old=SB_Global(nb_cfg=32) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:22:38 compute-0 nova_compute[192716]: 2025-10-07 22:22:38.946 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:38 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:38.946 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 07 22:22:38 compute-0 ovn_controller[94904]: 2025-10-07T22:22:38Z|00273|binding|INFO|Claiming lport ef354981-8b70-4bdd-ae10-8e0958d99587 for this chassis.
Oct 07 22:22:38 compute-0 ovn_controller[94904]: 2025-10-07T22:22:38Z|00274|binding|INFO|ef354981-8b70-4bdd-ae10-8e0958d99587: Claiming fa:16:3e:25:29:e3 10.100.0.13
Oct 07 22:22:38 compute-0 ovn_controller[94904]: 2025-10-07T22:22:38Z|00275|binding|INFO|Setting lport ef354981-8b70-4bdd-ae10-8e0958d99587 up in Southbound
Oct 07 22:22:40 compute-0 nova_compute[192716]: 2025-10-07 22:22:40.111 2 INFO nova.compute.manager [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59] Post operation of migration started
Oct 07 22:22:40 compute-0 nova_compute[192716]: 2025-10-07 22:22:40.112 2 WARNING neutronclient.v2_0.client [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:22:40 compute-0 nova_compute[192716]: 2025-10-07 22:22:40.659 2 WARNING neutronclient.v2_0.client [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:22:40 compute-0 nova_compute[192716]: 2025-10-07 22:22:40.660 2 WARNING neutronclient.v2_0.client [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:22:40 compute-0 nova_compute[192716]: 2025-10-07 22:22:40.804 2 DEBUG oslo_concurrency.lockutils [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "refresh_cache-b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 07 22:22:40 compute-0 nova_compute[192716]: 2025-10-07 22:22:40.804 2 DEBUG oslo_concurrency.lockutils [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquired lock "refresh_cache-b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 07 22:22:40 compute-0 nova_compute[192716]: 2025-10-07 22:22:40.805 2 DEBUG nova.network.neutron [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 07 22:22:41 compute-0 nova_compute[192716]: 2025-10-07 22:22:41.177 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:41 compute-0 nova_compute[192716]: 2025-10-07 22:22:41.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:41 compute-0 nova_compute[192716]: 2025-10-07 22:22:41.312 2 WARNING neutronclient.v2_0.client [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:22:41 compute-0 nova_compute[192716]: 2025-10-07 22:22:41.992 2 WARNING neutronclient.v2_0.client [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:22:42 compute-0 nova_compute[192716]: 2025-10-07 22:22:42.142 2 DEBUG nova.network.neutron [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59] Updating instance_info_cache with network_info: [{"id": "ef354981-8b70-4bdd-ae10-8e0958d99587", "address": "fa:16:3e:25:29:e3", "network": {"id": "8558ef14-bbda-4677-87c6-5cb9edc8eae5", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1182536919-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f08d1843f22b495a995f11f0b1c90ace", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef354981-8b", "ovs_interfaceid": "ef354981-8b70-4bdd-ae10-8e0958d99587", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:22:42 compute-0 nova_compute[192716]: 2025-10-07 22:22:42.498 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:22:42 compute-0 nova_compute[192716]: 2025-10-07 22:22:42.499 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 07 22:22:42 compute-0 nova_compute[192716]: 2025-10-07 22:22:42.651 2 DEBUG oslo_concurrency.lockutils [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Releasing lock "refresh_cache-b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 07 22:22:43 compute-0 nova_compute[192716]: 2025-10-07 22:22:43.177 2 DEBUG oslo_concurrency.lockutils [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:22:43 compute-0 nova_compute[192716]: 2025-10-07 22:22:43.178 2 DEBUG oslo_concurrency.lockutils [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:22:43 compute-0 nova_compute[192716]: 2025-10-07 22:22:43.178 2 DEBUG oslo_concurrency.lockutils [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:22:43 compute-0 nova_compute[192716]: 2025-10-07 22:22:43.182 2 INFO nova.virt.libvirt.driver [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Oct 07 22:22:43 compute-0 virtqemud[192532]: Domain id=24 name='instance-0000001e' uuid=b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59 is tainted: custom-monitor
Oct 07 22:22:43 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:43.951 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=dca786dc-b408-4181-8e47-0e14c60f13da, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '33'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:22:44 compute-0 nova_compute[192716]: 2025-10-07 22:22:44.191 2 INFO nova.virt.libvirt.driver [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Oct 07 22:22:44 compute-0 podman[228437]: 2025-10-07 22:22:44.884541621 +0000 UTC m=+0.105082814 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Oct 07 22:22:44 compute-0 podman[228436]: 2025-10-07 22:22:44.970805123 +0000 UTC m=+0.191755118 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251007, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0)
Oct 07 22:22:44 compute-0 nova_compute[192716]: 2025-10-07 22:22:44.992 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:22:45 compute-0 nova_compute[192716]: 2025-10-07 22:22:45.199 2 INFO nova.virt.libvirt.driver [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Oct 07 22:22:45 compute-0 nova_compute[192716]: 2025-10-07 22:22:45.206 2 DEBUG nova.compute.manager [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 07 22:22:45 compute-0 nova_compute[192716]: 2025-10-07 22:22:45.716 2 DEBUG nova.objects.instance [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Oct 07 22:22:46 compute-0 nova_compute[192716]: 2025-10-07 22:22:46.179 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:46 compute-0 nova_compute[192716]: 2025-10-07 22:22:46.195 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:46 compute-0 nova_compute[192716]: 2025-10-07 22:22:46.734 2 WARNING neutronclient.v2_0.client [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:22:47 compute-0 nova_compute[192716]: 2025-10-07 22:22:47.643 2 WARNING neutronclient.v2_0.client [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:22:47 compute-0 nova_compute[192716]: 2025-10-07 22:22:47.773 2 WARNING neutronclient.v2_0.client [None req-8fdff819-e5de-49d1-812f-730fb9a08406 28f4dcc77fea41a3a7deeefb3920cec7 3e3454b1c5584c5182164bbb63015c96 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:22:48 compute-0 nova_compute[192716]: 2025-10-07 22:22:48.729 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:22:49 compute-0 nova_compute[192716]: 2025-10-07 22:22:49.242 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Triggering sync for uuid b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59 _sync_power_states /usr/lib/python3.12/site-packages/nova/compute/manager.py:11020
Oct 07 22:22:49 compute-0 nova_compute[192716]: 2025-10-07 22:22:49.243 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Triggering sync for uuid e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6 _sync_power_states /usr/lib/python3.12/site-packages/nova/compute/manager.py:11020
Oct 07 22:22:49 compute-0 nova_compute[192716]: 2025-10-07 22:22:49.244 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:22:49 compute-0 nova_compute[192716]: 2025-10-07 22:22:49.244 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:22:49 compute-0 nova_compute[192716]: 2025-10-07 22:22:49.245 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:22:49 compute-0 nova_compute[192716]: 2025-10-07 22:22:49.246 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:22:49 compute-0 nova_compute[192716]: 2025-10-07 22:22:49.756 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.512s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:22:49 compute-0 nova_compute[192716]: 2025-10-07 22:22:49.758 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.512s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:22:50 compute-0 nova_compute[192716]: 2025-10-07 22:22:50.100 2 DEBUG oslo_concurrency.lockutils [None req-04e695f4-1e15-42b8-adc6-71e89d2a5c4e f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Acquiring lock "e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:22:50 compute-0 nova_compute[192716]: 2025-10-07 22:22:50.101 2 DEBUG oslo_concurrency.lockutils [None req-04e695f4-1e15-42b8-adc6-71e89d2a5c4e f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Lock "e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:22:50 compute-0 nova_compute[192716]: 2025-10-07 22:22:50.102 2 DEBUG oslo_concurrency.lockutils [None req-04e695f4-1e15-42b8-adc6-71e89d2a5c4e f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Acquiring lock "e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:22:50 compute-0 nova_compute[192716]: 2025-10-07 22:22:50.102 2 DEBUG oslo_concurrency.lockutils [None req-04e695f4-1e15-42b8-adc6-71e89d2a5c4e f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Lock "e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:22:50 compute-0 nova_compute[192716]: 2025-10-07 22:22:50.103 2 DEBUG oslo_concurrency.lockutils [None req-04e695f4-1e15-42b8-adc6-71e89d2a5c4e f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Lock "e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:22:50 compute-0 nova_compute[192716]: 2025-10-07 22:22:50.122 2 INFO nova.compute.manager [None req-04e695f4-1e15-42b8-adc6-71e89d2a5c4e f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Terminating instance
Oct 07 22:22:50 compute-0 nova_compute[192716]: 2025-10-07 22:22:50.636 2 DEBUG nova.compute.manager [None req-04e695f4-1e15-42b8-adc6-71e89d2a5c4e f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 07 22:22:50 compute-0 kernel: tap2959c2dd-2b (unregistering): left promiscuous mode
Oct 07 22:22:50 compute-0 NetworkManager[51722]: <info>  [1759875770.6624] device (tap2959c2dd-2b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 07 22:22:50 compute-0 ovn_controller[94904]: 2025-10-07T22:22:50Z|00276|binding|INFO|Releasing lport 2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c from this chassis (sb_readonly=0)
Oct 07 22:22:50 compute-0 ovn_controller[94904]: 2025-10-07T22:22:50Z|00277|binding|INFO|Setting lport 2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c down in Southbound
Oct 07 22:22:50 compute-0 ovn_controller[94904]: 2025-10-07T22:22:50Z|00278|binding|INFO|Removing iface tap2959c2dd-2b ovn-installed in OVS
Oct 07 22:22:50 compute-0 nova_compute[192716]: 2025-10-07 22:22:50.673 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:50 compute-0 nova_compute[192716]: 2025-10-07 22:22:50.676 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:50 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:50.690 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d8:7f:e0 10.100.0.11'], port_security=['fa:16:3e:d8:7f:e0 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8558ef14-bbda-4677-87c6-5cb9edc8eae5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4afdd5a94b604f91a0cbdb5a281ca0c6', 'neutron:revision_number': '5', 'neutron:security_group_ids': '4eb226e6-94bc-4936-86e1-40d24d079c69', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9e008ff2-8b66-4169-982c-1129d9c90df2, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], logical_port=2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f071072e510>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:22:50 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:50.691 103791 INFO neutron.agent.ovn.metadata.agent [-] Port 2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c in datapath 8558ef14-bbda-4677-87c6-5cb9edc8eae5 unbound from our chassis
Oct 07 22:22:50 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:50.693 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8558ef14-bbda-4677-87c6-5cb9edc8eae5
Oct 07 22:22:50 compute-0 nova_compute[192716]: 2025-10-07 22:22:50.693 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:50 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:50.717 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[8406fbf6-4b7c-4ac5-93cf-eec9dc1c4da9]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:22:50 compute-0 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d0000001f.scope: Deactivated successfully.
Oct 07 22:22:50 compute-0 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d0000001f.scope: Consumed 13.756s CPU time.
Oct 07 22:22:50 compute-0 systemd-machined[152719]: Machine qemu-23-instance-0000001f terminated.
Oct 07 22:22:50 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:50.770 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[9575a282-8b09-4071-a169-90eaeeaa18ea]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:22:50 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:50.774 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[5675e2f8-9c4e-49aa-8b8c-10f04023e702]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:22:50 compute-0 podman[228483]: 2025-10-07 22:22:50.789180095 +0000 UTC m=+0.100382978 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, config_id=edpm, vendor=Red Hat, Inc., version=9.6, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350)
Oct 07 22:22:50 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:50.816 215740 DEBUG oslo.privsep.daemon [-] privsep: reply[d3302816-7950-4c61-8095-ae669dacd1d7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:22:50 compute-0 nova_compute[192716]: 2025-10-07 22:22:50.837 2 DEBUG nova.compute.manager [req-1034f76f-0302-4bbf-a0d6-8832cc64b952 req-bd8cd25f-57af-446e-bbc1-dbe846f5ebfc 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Received event network-vif-unplugged-2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:22:50 compute-0 nova_compute[192716]: 2025-10-07 22:22:50.838 2 DEBUG oslo_concurrency.lockutils [req-1034f76f-0302-4bbf-a0d6-8832cc64b952 req-bd8cd25f-57af-446e-bbc1-dbe846f5ebfc 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:22:50 compute-0 nova_compute[192716]: 2025-10-07 22:22:50.838 2 DEBUG oslo_concurrency.lockutils [req-1034f76f-0302-4bbf-a0d6-8832cc64b952 req-bd8cd25f-57af-446e-bbc1-dbe846f5ebfc 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:22:50 compute-0 nova_compute[192716]: 2025-10-07 22:22:50.839 2 DEBUG oslo_concurrency.lockutils [req-1034f76f-0302-4bbf-a0d6-8832cc64b952 req-bd8cd25f-57af-446e-bbc1-dbe846f5ebfc 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:22:50 compute-0 nova_compute[192716]: 2025-10-07 22:22:50.839 2 DEBUG nova.compute.manager [req-1034f76f-0302-4bbf-a0d6-8832cc64b952 req-bd8cd25f-57af-446e-bbc1-dbe846f5ebfc 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] No waiting events found dispatching network-vif-unplugged-2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 22:22:50 compute-0 nova_compute[192716]: 2025-10-07 22:22:50.839 2 DEBUG nova.compute.manager [req-1034f76f-0302-4bbf-a0d6-8832cc64b952 req-bd8cd25f-57af-446e-bbc1-dbe846f5ebfc 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Received event network-vif-unplugged-2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 07 22:22:50 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:50.838 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[15e10c3d-09ca-49d6-b113-38bfc334ad2b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8558ef14-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:61:0b:74'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 30, 'tx_packets': 8, 'rx_bytes': 1756, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 30, 'tx_packets': 8, 'rx_bytes': 1756, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 70], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 545873, 'reachable_time': 35614, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228517, 'error': None, 'target': 'ovnmeta-8558ef14-bbda-4677-87c6-5cb9edc8eae5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:22:50 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:50.864 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[34cf0c09-7ee2-45a7-9f99-feee47e7217a]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap8558ef14-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 545888, 'tstamp': 545888}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228518, 'error': None, 'target': 'ovnmeta-8558ef14-bbda-4677-87c6-5cb9edc8eae5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap8558ef14-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 545892, 'tstamp': 545892}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228518, 'error': None, 'target': 'ovnmeta-8558ef14-bbda-4677-87c6-5cb9edc8eae5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:22:50 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:50.866 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8558ef14-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:22:50 compute-0 nova_compute[192716]: 2025-10-07 22:22:50.869 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:50 compute-0 nova_compute[192716]: 2025-10-07 22:22:50.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:50 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:50.875 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8558ef14-b0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:22:50 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:50.876 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:22:50 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:50.876 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8558ef14-b0, col_values=(('external_ids', {'iface-id': 'ffdd001d-1d41-425c-92ed-0208c7cec9cd'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:22:50 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:50.876 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 07 22:22:50 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:50.878 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[ba7e16ee-020c-40de-80d9-b7a1ceab1a2b]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-8558ef14-bbda-4677-87c6-5cb9edc8eae5\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/8558ef14-bbda-4677-87c6-5cb9edc8eae5.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 8558ef14-bbda-4677-87c6-5cb9edc8eae5\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:22:50 compute-0 nova_compute[192716]: 2025-10-07 22:22:50.922 2 INFO nova.virt.libvirt.driver [-] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Instance destroyed successfully.
Oct 07 22:22:50 compute-0 nova_compute[192716]: 2025-10-07 22:22:50.923 2 DEBUG nova.objects.instance [None req-04e695f4-1e15-42b8-adc6-71e89d2a5c4e f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Lazy-loading 'resources' on Instance uuid e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 22:22:50 compute-0 nova_compute[192716]: 2025-10-07 22:22:50.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:22:51 compute-0 nova_compute[192716]: 2025-10-07 22:22:51.184 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:51 compute-0 nova_compute[192716]: 2025-10-07 22:22:51.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:51 compute-0 nova_compute[192716]: 2025-10-07 22:22:51.430 2 DEBUG nova.virt.libvirt.vif [None req-04e695f4-1e15-42b8-adc6-71e89d2a5c4e f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-10-07T22:21:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1784705288',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1784705288',id=31,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-07T22:22:07Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4afdd5a94b604f91a0cbdb5a281ca0c6',ramdisk_id='',reservation_id='r-pbwh49x2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_
disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-14286770',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-14286770-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T22:22:07Z,user_data=None,user_id='f4a3102f6d8e4d53980ce5f605b5e7db',uuid=e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c", "address": "fa:16:3e:d8:7f:e0", "network": {"id": "8558ef14-bbda-4677-87c6-5cb9edc8eae5", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1182536919-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f08d1843f22b495a995f11f0b1c90ace", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2959c2dd-2b", "ovs_interfaceid": "2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 07 22:22:51 compute-0 nova_compute[192716]: 2025-10-07 22:22:51.431 2 DEBUG nova.network.os_vif_util [None req-04e695f4-1e15-42b8-adc6-71e89d2a5c4e f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Converting VIF {"id": "2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c", "address": "fa:16:3e:d8:7f:e0", "network": {"id": "8558ef14-bbda-4677-87c6-5cb9edc8eae5", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1182536919-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f08d1843f22b495a995f11f0b1c90ace", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2959c2dd-2b", "ovs_interfaceid": "2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 22:22:51 compute-0 nova_compute[192716]: 2025-10-07 22:22:51.432 2 DEBUG nova.network.os_vif_util [None req-04e695f4-1e15-42b8-adc6-71e89d2a5c4e f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d8:7f:e0,bridge_name='br-int',has_traffic_filtering=True,id=2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c,network=Network(8558ef14-bbda-4677-87c6-5cb9edc8eae5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2959c2dd-2b') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 22:22:51 compute-0 nova_compute[192716]: 2025-10-07 22:22:51.432 2 DEBUG os_vif [None req-04e695f4-1e15-42b8-adc6-71e89d2a5c4e f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d8:7f:e0,bridge_name='br-int',has_traffic_filtering=True,id=2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c,network=Network(8558ef14-bbda-4677-87c6-5cb9edc8eae5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2959c2dd-2b') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 07 22:22:51 compute-0 nova_compute[192716]: 2025-10-07 22:22:51.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:51 compute-0 nova_compute[192716]: 2025-10-07 22:22:51.435 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2959c2dd-2b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:22:51 compute-0 nova_compute[192716]: 2025-10-07 22:22:51.437 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:51 compute-0 nova_compute[192716]: 2025-10-07 22:22:51.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:51 compute-0 nova_compute[192716]: 2025-10-07 22:22:51.439 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:51 compute-0 nova_compute[192716]: 2025-10-07 22:22:51.440 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=f68c8be8-16ce-4893-9b27-aa637a9d87b6) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:22:51 compute-0 nova_compute[192716]: 2025-10-07 22:22:51.440 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:51 compute-0 nova_compute[192716]: 2025-10-07 22:22:51.441 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:51 compute-0 nova_compute[192716]: 2025-10-07 22:22:51.443 2 INFO os_vif [None req-04e695f4-1e15-42b8-adc6-71e89d2a5c4e f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d8:7f:e0,bridge_name='br-int',has_traffic_filtering=True,id=2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c,network=Network(8558ef14-bbda-4677-87c6-5cb9edc8eae5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2959c2dd-2b')
Oct 07 22:22:51 compute-0 nova_compute[192716]: 2025-10-07 22:22:51.444 2 INFO nova.virt.libvirt.driver [None req-04e695f4-1e15-42b8-adc6-71e89d2a5c4e f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Deleting instance files /var/lib/nova/instances/e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6_del
Oct 07 22:22:51 compute-0 nova_compute[192716]: 2025-10-07 22:22:51.445 2 INFO nova.virt.libvirt.driver [None req-04e695f4-1e15-42b8-adc6-71e89d2a5c4e f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Deletion of /var/lib/nova/instances/e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6_del complete
Oct 07 22:22:51 compute-0 nova_compute[192716]: 2025-10-07 22:22:51.507 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:22:51 compute-0 nova_compute[192716]: 2025-10-07 22:22:51.507 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:22:51 compute-0 nova_compute[192716]: 2025-10-07 22:22:51.507 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:22:51 compute-0 nova_compute[192716]: 2025-10-07 22:22:51.507 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 07 22:22:51 compute-0 nova_compute[192716]: 2025-10-07 22:22:51.960 2 INFO nova.compute.manager [None req-04e695f4-1e15-42b8-adc6-71e89d2a5c4e f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Took 1.32 seconds to destroy the instance on the hypervisor.
Oct 07 22:22:51 compute-0 nova_compute[192716]: 2025-10-07 22:22:51.961 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-04e695f4-1e15-42b8-adc6-71e89d2a5c4e f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 07 22:22:51 compute-0 nova_compute[192716]: 2025-10-07 22:22:51.962 2 DEBUG nova.compute.manager [-] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 07 22:22:51 compute-0 nova_compute[192716]: 2025-10-07 22:22:51.962 2 DEBUG nova.network.neutron [-] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 07 22:22:51 compute-0 nova_compute[192716]: 2025-10-07 22:22:51.962 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:22:52 compute-0 nova_compute[192716]: 2025-10-07 22:22:52.554 2 WARNING nova.virt.libvirt.driver [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Error from libvirt while getting description of instance-0000001f: [Error Code 42] Domain not found: no domain with matching uuid 'e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6' (instance-0000001f): libvirt.libvirtError: Domain not found: no domain with matching uuid 'e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6' (instance-0000001f)
Oct 07 22:22:52 compute-0 nova_compute[192716]: 2025-10-07 22:22:52.560 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:22:52 compute-0 nova_compute[192716]: 2025-10-07 22:22:52.627 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:22:52 compute-0 nova_compute[192716]: 2025-10-07 22:22:52.629 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:22:52 compute-0 nova_compute[192716]: 2025-10-07 22:22:52.692 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:22:52 compute-0 nova_compute[192716]: 2025-10-07 22:22:52.697 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:22:52 compute-0 nova_compute[192716]: 2025-10-07 22:22:52.863 2 WARNING nova.virt.libvirt.driver [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 22:22:52 compute-0 nova_compute[192716]: 2025-10-07 22:22:52.865 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:22:52 compute-0 nova_compute[192716]: 2025-10-07 22:22:52.899 2 DEBUG nova.compute.manager [req-fe913345-6938-491f-924f-c24ab7deb63c req-819f5c30-9fec-461f-9464-07ebf0c70c32 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Received event network-vif-unplugged-2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:22:52 compute-0 nova_compute[192716]: 2025-10-07 22:22:52.899 2 DEBUG oslo_concurrency.lockutils [req-fe913345-6938-491f-924f-c24ab7deb63c req-819f5c30-9fec-461f-9464-07ebf0c70c32 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:22:52 compute-0 nova_compute[192716]: 2025-10-07 22:22:52.900 2 DEBUG oslo_concurrency.lockutils [req-fe913345-6938-491f-924f-c24ab7deb63c req-819f5c30-9fec-461f-9464-07ebf0c70c32 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:22:52 compute-0 nova_compute[192716]: 2025-10-07 22:22:52.900 2 DEBUG oslo_concurrency.lockutils [req-fe913345-6938-491f-924f-c24ab7deb63c req-819f5c30-9fec-461f-9464-07ebf0c70c32 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:22:52 compute-0 nova_compute[192716]: 2025-10-07 22:22:52.900 2 DEBUG nova.compute.manager [req-fe913345-6938-491f-924f-c24ab7deb63c req-819f5c30-9fec-461f-9464-07ebf0c70c32 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] No waiting events found dispatching network-vif-unplugged-2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 22:22:52 compute-0 nova_compute[192716]: 2025-10-07 22:22:52.900 2 DEBUG nova.compute.manager [req-fe913345-6938-491f-924f-c24ab7deb63c req-819f5c30-9fec-461f-9464-07ebf0c70c32 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Received event network-vif-unplugged-2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 07 22:22:52 compute-0 nova_compute[192716]: 2025-10-07 22:22:52.906 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:22:52 compute-0 nova_compute[192716]: 2025-10-07 22:22:52.907 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5665MB free_disk=73.26979446411133GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 07 22:22:52 compute-0 nova_compute[192716]: 2025-10-07 22:22:52.907 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:22:52 compute-0 nova_compute[192716]: 2025-10-07 22:22:52.907 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:22:53 compute-0 nova_compute[192716]: 2025-10-07 22:22:53.004 2 DEBUG nova.compute.manager [req-0fcef0a4-337d-4534-aed3-9f68ef111ae1 req-8b4cb906-aff3-4a49-b16e-180d64923ef2 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Received event network-vif-deleted-2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:22:53 compute-0 nova_compute[192716]: 2025-10-07 22:22:53.004 2 INFO nova.compute.manager [req-0fcef0a4-337d-4534-aed3-9f68ef111ae1 req-8b4cb906-aff3-4a49-b16e-180d64923ef2 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Neutron deleted interface 2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c; detaching it from the instance and deleting it from the info cache
Oct 07 22:22:53 compute-0 nova_compute[192716]: 2025-10-07 22:22:53.005 2 DEBUG nova.network.neutron [req-0fcef0a4-337d-4534-aed3-9f68ef111ae1 req-8b4cb906-aff3-4a49-b16e-180d64923ef2 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:22:53 compute-0 nova_compute[192716]: 2025-10-07 22:22:53.426 2 DEBUG nova.network.neutron [-] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:22:53 compute-0 nova_compute[192716]: 2025-10-07 22:22:53.666 2 DEBUG nova.compute.manager [req-0fcef0a4-337d-4534-aed3-9f68ef111ae1 req-8b4cb906-aff3-4a49-b16e-180d64923ef2 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Detach interface failed, port_id=2959c2dd-2b57-4bc3-91bd-f44a30dd8a9c, reason: Instance e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 07 22:22:54 compute-0 nova_compute[192716]: 2025-10-07 22:22:54.164 2 INFO nova.compute.manager [-] [instance: e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6] Took 2.20 seconds to deallocate network for instance.
Oct 07 22:22:54 compute-0 nova_compute[192716]: 2025-10-07 22:22:54.685 2 DEBUG oslo_concurrency.lockutils [None req-04e695f4-1e15-42b8-adc6-71e89d2a5c4e f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:22:54 compute-0 nova_compute[192716]: 2025-10-07 22:22:54.720 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Instance e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 07 22:22:54 compute-0 nova_compute[192716]: 2025-10-07 22:22:54.720 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Instance b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 07 22:22:54 compute-0 nova_compute[192716]: 2025-10-07 22:22:54.721 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 07 22:22:54 compute-0 nova_compute[192716]: 2025-10-07 22:22:54.721 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:22:52 up  1:31,  0 user,  load average: 0.61, 0.22, 0.20\n', 'num_instances': '2', 'num_vm_active': '2', 'num_task_None': '1', 'num_os_type_None': '2', 'num_proj_4afdd5a94b604f91a0cbdb5a281ca0c6': '2', 'io_workload': '0', 'num_task_deleting': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 07 22:22:54 compute-0 nova_compute[192716]: 2025-10-07 22:22:54.830 2 DEBUG nova.compute.provider_tree [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 22:22:55 compute-0 nova_compute[192716]: 2025-10-07 22:22:55.337 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 22:22:55 compute-0 nova_compute[192716]: 2025-10-07 22:22:55.848 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 07 22:22:55 compute-0 nova_compute[192716]: 2025-10-07 22:22:55.849 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.941s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:22:55 compute-0 nova_compute[192716]: 2025-10-07 22:22:55.849 2 DEBUG oslo_concurrency.lockutils [None req-04e695f4-1e15-42b8-adc6-71e89d2a5c4e f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 1.165s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:22:55 compute-0 nova_compute[192716]: 2025-10-07 22:22:55.924 2 DEBUG nova.compute.provider_tree [None req-04e695f4-1e15-42b8-adc6-71e89d2a5c4e f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 22:22:56 compute-0 nova_compute[192716]: 2025-10-07 22:22:56.200 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:56 compute-0 nova_compute[192716]: 2025-10-07 22:22:56.433 2 DEBUG nova.scheduler.client.report [None req-04e695f4-1e15-42b8-adc6-71e89d2a5c4e f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 22:22:56 compute-0 nova_compute[192716]: 2025-10-07 22:22:56.441 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:56 compute-0 nova_compute[192716]: 2025-10-07 22:22:56.845 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:22:56 compute-0 nova_compute[192716]: 2025-10-07 22:22:56.846 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:22:56 compute-0 nova_compute[192716]: 2025-10-07 22:22:56.943 2 DEBUG oslo_concurrency.lockutils [None req-04e695f4-1e15-42b8-adc6-71e89d2a5c4e f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.094s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:22:56 compute-0 nova_compute[192716]: 2025-10-07 22:22:56.970 2 INFO nova.scheduler.client.report [None req-04e695f4-1e15-42b8-adc6-71e89d2a5c4e f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Deleted allocations for instance e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6
Oct 07 22:22:57 compute-0 nova_compute[192716]: 2025-10-07 22:22:57.998 2 DEBUG oslo_concurrency.lockutils [None req-04e695f4-1e15-42b8-adc6-71e89d2a5c4e f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Lock "e7b1ccef-2795-4dd2-9a2f-99d55c71d4c6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.897s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:22:58 compute-0 nova_compute[192716]: 2025-10-07 22:22:58.555 2 DEBUG oslo_concurrency.lockutils [None req-7dd0623b-708f-4660-b87f-b42676a33a36 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Acquiring lock "b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:22:58 compute-0 nova_compute[192716]: 2025-10-07 22:22:58.555 2 DEBUG oslo_concurrency.lockutils [None req-7dd0623b-708f-4660-b87f-b42676a33a36 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Lock "b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:22:58 compute-0 nova_compute[192716]: 2025-10-07 22:22:58.556 2 DEBUG oslo_concurrency.lockutils [None req-7dd0623b-708f-4660-b87f-b42676a33a36 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Acquiring lock "b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:22:58 compute-0 nova_compute[192716]: 2025-10-07 22:22:58.556 2 DEBUG oslo_concurrency.lockutils [None req-7dd0623b-708f-4660-b87f-b42676a33a36 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Lock "b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:22:58 compute-0 nova_compute[192716]: 2025-10-07 22:22:58.557 2 DEBUG oslo_concurrency.lockutils [None req-7dd0623b-708f-4660-b87f-b42676a33a36 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Lock "b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:22:58 compute-0 nova_compute[192716]: 2025-10-07 22:22:58.570 2 INFO nova.compute.manager [None req-7dd0623b-708f-4660-b87f-b42676a33a36 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] [instance: b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59] Terminating instance
Oct 07 22:22:58 compute-0 nova_compute[192716]: 2025-10-07 22:22:58.989 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:22:58 compute-0 nova_compute[192716]: 2025-10-07 22:22:58.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:22:59 compute-0 nova_compute[192716]: 2025-10-07 22:22:59.086 2 DEBUG nova.compute.manager [None req-7dd0623b-708f-4660-b87f-b42676a33a36 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] [instance: b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 07 22:22:59 compute-0 kernel: tapef354981-8b (unregistering): left promiscuous mode
Oct 07 22:22:59 compute-0 NetworkManager[51722]: <info>  [1759875779.1236] device (tapef354981-8b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 07 22:22:59 compute-0 nova_compute[192716]: 2025-10-07 22:22:59.132 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:59 compute-0 ovn_controller[94904]: 2025-10-07T22:22:59Z|00279|binding|INFO|Releasing lport ef354981-8b70-4bdd-ae10-8e0958d99587 from this chassis (sb_readonly=0)
Oct 07 22:22:59 compute-0 ovn_controller[94904]: 2025-10-07T22:22:59Z|00280|binding|INFO|Setting lport ef354981-8b70-4bdd-ae10-8e0958d99587 down in Southbound
Oct 07 22:22:59 compute-0 ovn_controller[94904]: 2025-10-07T22:22:59Z|00281|binding|INFO|Removing iface tapef354981-8b ovn-installed in OVS
Oct 07 22:22:59 compute-0 nova_compute[192716]: 2025-10-07 22:22:59.135 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:59 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:59.142 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:25:29:e3 10.100.0.13'], port_security=['fa:16:3e:25:29:e3 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8558ef14-bbda-4677-87c6-5cb9edc8eae5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4afdd5a94b604f91a0cbdb5a281ca0c6', 'neutron:revision_number': '15', 'neutron:security_group_ids': '4eb226e6-94bc-4936-86e1-40d24d079c69', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9e008ff2-8b66-4169-982c-1129d9c90df2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f071072e510>], logical_port=ef354981-8b70-4bdd-ae10-8e0958d99587) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f071072e510>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:22:59 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:59.143 103791 INFO neutron.agent.ovn.metadata.agent [-] Port ef354981-8b70-4bdd-ae10-8e0958d99587 in datapath 8558ef14-bbda-4677-87c6-5cb9edc8eae5 unbound from our chassis
Oct 07 22:22:59 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:59.144 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8558ef14-bbda-4677-87c6-5cb9edc8eae5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 07 22:22:59 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:59.146 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[d1512083-8060-4989-8ccc-e3380b9ce56e]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:22:59 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:59.146 103791 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8558ef14-bbda-4677-87c6-5cb9edc8eae5 namespace which is not needed anymore
Oct 07 22:22:59 compute-0 nova_compute[192716]: 2025-10-07 22:22:59.167 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:59 compute-0 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d0000001e.scope: Deactivated successfully.
Oct 07 22:22:59 compute-0 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d0000001e.scope: Consumed 2.797s CPU time.
Oct 07 22:22:59 compute-0 systemd-machined[152719]: Machine qemu-24-instance-0000001e terminated.
Oct 07 22:22:59 compute-0 nova_compute[192716]: 2025-10-07 22:22:59.278 2 DEBUG nova.compute.manager [req-76ffec38-d40e-415d-9c82-a0f506df9106 req-a737e9e4-d08c-423c-8514-620ce4f54a4d 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59] Received event network-vif-unplugged-ef354981-8b70-4bdd-ae10-8e0958d99587 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:22:59 compute-0 nova_compute[192716]: 2025-10-07 22:22:59.280 2 DEBUG oslo_concurrency.lockutils [req-76ffec38-d40e-415d-9c82-a0f506df9106 req-a737e9e4-d08c-423c-8514-620ce4f54a4d 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:22:59 compute-0 nova_compute[192716]: 2025-10-07 22:22:59.280 2 DEBUG oslo_concurrency.lockutils [req-76ffec38-d40e-415d-9c82-a0f506df9106 req-a737e9e4-d08c-423c-8514-620ce4f54a4d 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:22:59 compute-0 nova_compute[192716]: 2025-10-07 22:22:59.280 2 DEBUG oslo_concurrency.lockutils [req-76ffec38-d40e-415d-9c82-a0f506df9106 req-a737e9e4-d08c-423c-8514-620ce4f54a4d 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:22:59 compute-0 nova_compute[192716]: 2025-10-07 22:22:59.281 2 DEBUG nova.compute.manager [req-76ffec38-d40e-415d-9c82-a0f506df9106 req-a737e9e4-d08c-423c-8514-620ce4f54a4d 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59] No waiting events found dispatching network-vif-unplugged-ef354981-8b70-4bdd-ae10-8e0958d99587 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 22:22:59 compute-0 nova_compute[192716]: 2025-10-07 22:22:59.281 2 DEBUG nova.compute.manager [req-76ffec38-d40e-415d-9c82-a0f506df9106 req-a737e9e4-d08c-423c-8514-620ce4f54a4d 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59] Received event network-vif-unplugged-ef354981-8b70-4bdd-ae10-8e0958d99587 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 07 22:22:59 compute-0 neutron-haproxy-ovnmeta-8558ef14-bbda-4677-87c6-5cb9edc8eae5[228183]: [NOTICE]   (228187) : haproxy version is 3.0.5-8e879a5
Oct 07 22:22:59 compute-0 neutron-haproxy-ovnmeta-8558ef14-bbda-4677-87c6-5cb9edc8eae5[228183]: [NOTICE]   (228187) : path to executable is /usr/sbin/haproxy
Oct 07 22:22:59 compute-0 neutron-haproxy-ovnmeta-8558ef14-bbda-4677-87c6-5cb9edc8eae5[228183]: [WARNING]  (228187) : Exiting Master process...
Oct 07 22:22:59 compute-0 podman[228571]: 2025-10-07 22:22:59.321826101 +0000 UTC m=+0.042196275 container kill e021446a9d152fef8aa75cd89b1f7fd81082b521522a152a9fd5f5fe3199b988 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-8558ef14-bbda-4677-87c6-5cb9edc8eae5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Oct 07 22:22:59 compute-0 neutron-haproxy-ovnmeta-8558ef14-bbda-4677-87c6-5cb9edc8eae5[228183]: [ALERT]    (228187) : Current worker (228189) exited with code 143 (Terminated)
Oct 07 22:22:59 compute-0 neutron-haproxy-ovnmeta-8558ef14-bbda-4677-87c6-5cb9edc8eae5[228183]: [WARNING]  (228187) : All workers exited. Exiting... (0)
Oct 07 22:22:59 compute-0 systemd[1]: libpod-e021446a9d152fef8aa75cd89b1f7fd81082b521522a152a9fd5f5fe3199b988.scope: Deactivated successfully.
Oct 07 22:22:59 compute-0 nova_compute[192716]: 2025-10-07 22:22:59.382 2 INFO nova.virt.libvirt.driver [-] [instance: b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59] Instance destroyed successfully.
Oct 07 22:22:59 compute-0 nova_compute[192716]: 2025-10-07 22:22:59.383 2 DEBUG nova.objects.instance [None req-7dd0623b-708f-4660-b87f-b42676a33a36 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Lazy-loading 'resources' on Instance uuid b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 07 22:22:59 compute-0 podman[228595]: 2025-10-07 22:22:59.406299741 +0000 UTC m=+0.048275730 container died e021446a9d152fef8aa75cd89b1f7fd81082b521522a152a9fd5f5fe3199b988 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-8558ef14-bbda-4677-87c6-5cb9edc8eae5, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct 07 22:22:59 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e021446a9d152fef8aa75cd89b1f7fd81082b521522a152a9fd5f5fe3199b988-userdata-shm.mount: Deactivated successfully.
Oct 07 22:22:59 compute-0 systemd[1]: var-lib-containers-storage-overlay-9e99da351bc16b134d315c3fd07d21ff5672ba3fe082522e97ca2ee97d93fedb-merged.mount: Deactivated successfully.
Oct 07 22:22:59 compute-0 podman[228595]: 2025-10-07 22:22:59.45319809 +0000 UTC m=+0.095174079 container cleanup e021446a9d152fef8aa75cd89b1f7fd81082b521522a152a9fd5f5fe3199b988 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-8558ef14-bbda-4677-87c6-5cb9edc8eae5, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=watcher_latest)
Oct 07 22:22:59 compute-0 systemd[1]: libpod-conmon-e021446a9d152fef8aa75cd89b1f7fd81082b521522a152a9fd5f5fe3199b988.scope: Deactivated successfully.
Oct 07 22:22:59 compute-0 podman[228599]: 2025-10-07 22:22:59.475726558 +0000 UTC m=+0.107483663 container remove e021446a9d152fef8aa75cd89b1f7fd81082b521522a152a9fd5f5fe3199b988 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-8558ef14-bbda-4677-87c6-5cb9edc8eae5, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS)
Oct 07 22:22:59 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:59.482 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[1dcc7fad-9227-4852-8dfe-c329bdc3e6af]: (4, ("Tue Oct  7 10:22:59 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-8558ef14-bbda-4677-87c6-5cb9edc8eae5 (e021446a9d152fef8aa75cd89b1f7fd81082b521522a152a9fd5f5fe3199b988)\ne021446a9d152fef8aa75cd89b1f7fd81082b521522a152a9fd5f5fe3199b988\nTue Oct  7 10:22:59 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-8558ef14-bbda-4677-87c6-5cb9edc8eae5 (e021446a9d152fef8aa75cd89b1f7fd81082b521522a152a9fd5f5fe3199b988)\ne021446a9d152fef8aa75cd89b1f7fd81082b521522a152a9fd5f5fe3199b988\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:22:59 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:59.484 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[4d33c31e-5ac3-42ad-a155-92094e90cd27]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:22:59 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:59.485 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8558ef14-bbda-4677-87c6-5cb9edc8eae5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8558ef14-bbda-4677-87c6-5cb9edc8eae5.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 07 22:22:59 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:59.486 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[560971a7-4d43-4fb1-826c-246303c13d16]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:22:59 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:59.486 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8558ef14-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:22:59 compute-0 nova_compute[192716]: 2025-10-07 22:22:59.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:59 compute-0 kernel: tap8558ef14-b0: left promiscuous mode
Oct 07 22:22:59 compute-0 nova_compute[192716]: 2025-10-07 22:22:59.521 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:59 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:59.525 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[e0073bc5-ed09-44af-ab0c-85566e78e08d]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:22:59 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:59.563 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[78aa29ca-b205-4f26-ab7c-7cf8bd7b290f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:22:59 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:59.565 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[366e9d13-4201-496c-a85f-8a66cf20f01d]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:22:59 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:59.591 214116 DEBUG oslo.privsep.daemon [-] privsep: reply[efa1c977-33ed-4612-be28-2fd4e4020616]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 545863, 'reachable_time': 17532, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228638, 'error': None, 'target': 'ovnmeta-8558ef14-bbda-4677-87c6-5cb9edc8eae5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:22:59 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:59.595 103905 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8558ef14-bbda-4677-87c6-5cb9edc8eae5 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Oct 07 22:22:59 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:22:59.596 103905 DEBUG oslo.privsep.daemon [-] privsep: reply[235f73f1-019c-4040-aee8-5bc97cf67fd0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 07 22:22:59 compute-0 systemd[1]: run-netns-ovnmeta\x2d8558ef14\x2dbbda\x2d4677\x2d87c6\x2d5cb9edc8eae5.mount: Deactivated successfully.
Oct 07 22:22:59 compute-0 podman[203153]: time="2025-10-07T22:22:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:22:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:22:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 22:22:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:22:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3028 "" "Go-http-client/1.1"
Oct 07 22:22:59 compute-0 nova_compute[192716]: 2025-10-07 22:22:59.891 2 DEBUG nova.virt.libvirt.vif [None req-7dd0623b-708f-4660-b87f-b42676a33a36 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-10-07T22:21:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1708960416',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1708960416',id=30,image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-07T22:21:43Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4afdd5a94b604f91a0cbdb5a281ca0c6',ramdisk_id='',reservation_id='r-u0lf1aid',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',clean_attempts='1',image_base_image_ref='c40cab67-7e52-4762-b275-de0efa24bdf4',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='
virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-14286770',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-14286770-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T22:22:46Z,user_data=None,user_id='f4a3102f6d8e4d53980ce5f605b5e7db',uuid=b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ef354981-8b70-4bdd-ae10-8e0958d99587", "address": "fa:16:3e:25:29:e3", "network": {"id": "8558ef14-bbda-4677-87c6-5cb9edc8eae5", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1182536919-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f08d1843f22b495a995f11f0b1c90ace", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef354981-8b", "ovs_interfaceid": "ef354981-8b70-4bdd-ae10-8e0958d99587", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 07 22:22:59 compute-0 nova_compute[192716]: 2025-10-07 22:22:59.892 2 DEBUG nova.network.os_vif_util [None req-7dd0623b-708f-4660-b87f-b42676a33a36 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Converting VIF {"id": "ef354981-8b70-4bdd-ae10-8e0958d99587", "address": "fa:16:3e:25:29:e3", "network": {"id": "8558ef14-bbda-4677-87c6-5cb9edc8eae5", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1182536919-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f08d1843f22b495a995f11f0b1c90ace", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef354981-8b", "ovs_interfaceid": "ef354981-8b70-4bdd-ae10-8e0958d99587", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 07 22:22:59 compute-0 nova_compute[192716]: 2025-10-07 22:22:59.893 2 DEBUG nova.network.os_vif_util [None req-7dd0623b-708f-4660-b87f-b42676a33a36 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:25:29:e3,bridge_name='br-int',has_traffic_filtering=True,id=ef354981-8b70-4bdd-ae10-8e0958d99587,network=Network(8558ef14-bbda-4677-87c6-5cb9edc8eae5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef354981-8b') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 07 22:22:59 compute-0 nova_compute[192716]: 2025-10-07 22:22:59.893 2 DEBUG os_vif [None req-7dd0623b-708f-4660-b87f-b42676a33a36 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:25:29:e3,bridge_name='br-int',has_traffic_filtering=True,id=ef354981-8b70-4bdd-ae10-8e0958d99587,network=Network(8558ef14-bbda-4677-87c6-5cb9edc8eae5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef354981-8b') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 07 22:22:59 compute-0 nova_compute[192716]: 2025-10-07 22:22:59.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:59 compute-0 nova_compute[192716]: 2025-10-07 22:22:59.896 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapef354981-8b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:22:59 compute-0 nova_compute[192716]: 2025-10-07 22:22:59.898 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:59 compute-0 nova_compute[192716]: 2025-10-07 22:22:59.901 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 07 22:22:59 compute-0 nova_compute[192716]: 2025-10-07 22:22:59.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:59 compute-0 nova_compute[192716]: 2025-10-07 22:22:59.903 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=5350d1a9-0fbe-4bf4-90d9-b76fc507381a) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:22:59 compute-0 nova_compute[192716]: 2025-10-07 22:22:59.904 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:22:59 compute-0 nova_compute[192716]: 2025-10-07 22:22:59.907 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 07 22:22:59 compute-0 nova_compute[192716]: 2025-10-07 22:22:59.909 2 INFO os_vif [None req-7dd0623b-708f-4660-b87f-b42676a33a36 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:25:29:e3,bridge_name='br-int',has_traffic_filtering=True,id=ef354981-8b70-4bdd-ae10-8e0958d99587,network=Network(8558ef14-bbda-4677-87c6-5cb9edc8eae5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef354981-8b')
Oct 07 22:22:59 compute-0 nova_compute[192716]: 2025-10-07 22:22:59.910 2 INFO nova.virt.libvirt.driver [None req-7dd0623b-708f-4660-b87f-b42676a33a36 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] [instance: b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59] Deleting instance files /var/lib/nova/instances/b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59_del
Oct 07 22:22:59 compute-0 nova_compute[192716]: 2025-10-07 22:22:59.911 2 INFO nova.virt.libvirt.driver [None req-7dd0623b-708f-4660-b87f-b42676a33a36 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] [instance: b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59] Deletion of /var/lib/nova/instances/b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59_del complete
Oct 07 22:23:00 compute-0 nova_compute[192716]: 2025-10-07 22:23:00.425 2 INFO nova.compute.manager [None req-7dd0623b-708f-4660-b87f-b42676a33a36 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] [instance: b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59] Took 1.34 seconds to destroy the instance on the hypervisor.
Oct 07 22:23:00 compute-0 nova_compute[192716]: 2025-10-07 22:23:00.426 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-7dd0623b-708f-4660-b87f-b42676a33a36 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 07 22:23:00 compute-0 nova_compute[192716]: 2025-10-07 22:23:00.426 2 DEBUG nova.compute.manager [-] [instance: b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 07 22:23:00 compute-0 nova_compute[192716]: 2025-10-07 22:23:00.427 2 DEBUG nova.network.neutron [-] [instance: b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 07 22:23:00 compute-0 nova_compute[192716]: 2025-10-07 22:23:00.427 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:23:00 compute-0 nova_compute[192716]: 2025-10-07 22:23:00.653 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 07 22:23:01 compute-0 nova_compute[192716]: 2025-10-07 22:23:01.246 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:23:01 compute-0 nova_compute[192716]: 2025-10-07 22:23:01.390 2 DEBUG nova.compute.manager [req-cac65dd6-391a-4ab5-9b9d-b94bbdc44ad9 req-044f6afe-f2cb-4ee2-94c8-69884114e754 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59] Received event network-vif-unplugged-ef354981-8b70-4bdd-ae10-8e0958d99587 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:23:01 compute-0 nova_compute[192716]: 2025-10-07 22:23:01.391 2 DEBUG oslo_concurrency.lockutils [req-cac65dd6-391a-4ab5-9b9d-b94bbdc44ad9 req-044f6afe-f2cb-4ee2-94c8-69884114e754 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Acquiring lock "b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:23:01 compute-0 nova_compute[192716]: 2025-10-07 22:23:01.391 2 DEBUG oslo_concurrency.lockutils [req-cac65dd6-391a-4ab5-9b9d-b94bbdc44ad9 req-044f6afe-f2cb-4ee2-94c8-69884114e754 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:23:01 compute-0 nova_compute[192716]: 2025-10-07 22:23:01.391 2 DEBUG oslo_concurrency.lockutils [req-cac65dd6-391a-4ab5-9b9d-b94bbdc44ad9 req-044f6afe-f2cb-4ee2-94c8-69884114e754 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] Lock "b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:23:01 compute-0 nova_compute[192716]: 2025-10-07 22:23:01.391 2 DEBUG nova.compute.manager [req-cac65dd6-391a-4ab5-9b9d-b94bbdc44ad9 req-044f6afe-f2cb-4ee2-94c8-69884114e754 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59] No waiting events found dispatching network-vif-unplugged-ef354981-8b70-4bdd-ae10-8e0958d99587 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 07 22:23:01 compute-0 nova_compute[192716]: 2025-10-07 22:23:01.392 2 DEBUG nova.compute.manager [req-cac65dd6-391a-4ab5-9b9d-b94bbdc44ad9 req-044f6afe-f2cb-4ee2-94c8-69884114e754 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59] Received event network-vif-unplugged-ef354981-8b70-4bdd-ae10-8e0958d99587 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 07 22:23:01 compute-0 nova_compute[192716]: 2025-10-07 22:23:01.392 2 DEBUG nova.compute.manager [req-cac65dd6-391a-4ab5-9b9d-b94bbdc44ad9 req-044f6afe-f2cb-4ee2-94c8-69884114e754 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59] Received event network-vif-deleted-ef354981-8b70-4bdd-ae10-8e0958d99587 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 07 22:23:01 compute-0 nova_compute[192716]: 2025-10-07 22:23:01.392 2 INFO nova.compute.manager [req-cac65dd6-391a-4ab5-9b9d-b94bbdc44ad9 req-044f6afe-f2cb-4ee2-94c8-69884114e754 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59] Neutron deleted interface ef354981-8b70-4bdd-ae10-8e0958d99587; detaching it from the instance and deleting it from the info cache
Oct 07 22:23:01 compute-0 nova_compute[192716]: 2025-10-07 22:23:01.392 2 DEBUG nova.network.neutron [req-cac65dd6-391a-4ab5-9b9d-b94bbdc44ad9 req-044f6afe-f2cb-4ee2-94c8-69884114e754 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:23:01 compute-0 openstack_network_exporter[205305]: ERROR   22:23:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:23:01 compute-0 openstack_network_exporter[205305]: ERROR   22:23:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:23:01 compute-0 openstack_network_exporter[205305]: ERROR   22:23:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:23:01 compute-0 openstack_network_exporter[205305]: ERROR   22:23:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:23:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:23:01 compute-0 openstack_network_exporter[205305]: ERROR   22:23:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:23:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:23:01 compute-0 nova_compute[192716]: 2025-10-07 22:23:01.541 2 DEBUG nova.network.neutron [-] [instance: b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 07 22:23:01 compute-0 nova_compute[192716]: 2025-10-07 22:23:01.900 2 DEBUG nova.compute.manager [req-cac65dd6-391a-4ab5-9b9d-b94bbdc44ad9 req-044f6afe-f2cb-4ee2-94c8-69884114e754 4b10b0cea5b14c09bbb86b982363d4ed 3e3454b1c5584c5182164bbb63015c96 - - default default] [instance: b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59] Detach interface failed, port_id=ef354981-8b70-4bdd-ae10-8e0958d99587, reason: Instance b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 07 22:23:01 compute-0 nova_compute[192716]: 2025-10-07 22:23:01.991 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:23:02 compute-0 nova_compute[192716]: 2025-10-07 22:23:02.052 2 INFO nova.compute.manager [-] [instance: b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59] Took 1.63 seconds to deallocate network for instance.
Oct 07 22:23:02 compute-0 nova_compute[192716]: 2025-10-07 22:23:02.570 2 DEBUG oslo_concurrency.lockutils [None req-7dd0623b-708f-4660-b87f-b42676a33a36 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:23:02 compute-0 nova_compute[192716]: 2025-10-07 22:23:02.571 2 DEBUG oslo_concurrency.lockutils [None req-7dd0623b-708f-4660-b87f-b42676a33a36 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:23:02 compute-0 nova_compute[192716]: 2025-10-07 22:23:02.646 2 DEBUG nova.compute.provider_tree [None req-7dd0623b-708f-4660-b87f-b42676a33a36 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 22:23:03 compute-0 nova_compute[192716]: 2025-10-07 22:23:03.154 2 DEBUG nova.scheduler.client.report [None req-7dd0623b-708f-4660-b87f-b42676a33a36 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 22:23:03 compute-0 nova_compute[192716]: 2025-10-07 22:23:03.666 2 DEBUG oslo_concurrency.lockutils [None req-7dd0623b-708f-4660-b87f-b42676a33a36 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.094s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:23:03 compute-0 nova_compute[192716]: 2025-10-07 22:23:03.715 2 INFO nova.scheduler.client.report [None req-7dd0623b-708f-4660-b87f-b42676a33a36 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Deleted allocations for instance b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59
Oct 07 22:23:03 compute-0 podman[228639]: 2025-10-07 22:23:03.823842924 +0000 UTC m=+0.065417313 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, tcib_build_tag=watcher_latest, container_name=iscsid, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS)
Oct 07 22:23:03 compute-0 podman[228640]: 2025-10-07 22:23:03.863727661 +0000 UTC m=+0.092035149 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 07 22:23:04 compute-0 nova_compute[192716]: 2025-10-07 22:23:04.750 2 DEBUG oslo_concurrency.lockutils [None req-7dd0623b-708f-4660-b87f-b42676a33a36 f4a3102f6d8e4d53980ce5f605b5e7db 4afdd5a94b604f91a0cbdb5a281ca0c6 - - default default] Lock "b07a15b1-b0ee-4a6d-8df3-3ec8afeb9b59" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.195s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:23:04 compute-0 nova_compute[192716]: 2025-10-07 22:23:04.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:23:06 compute-0 nova_compute[192716]: 2025-10-07 22:23:06.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:23:06 compute-0 podman[228681]: 2025-10-07 22:23:06.827057079 +0000 UTC m=+0.070093288 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 07 22:23:08 compute-0 nova_compute[192716]: 2025-10-07 22:23:08.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:23:09 compute-0 nova_compute[192716]: 2025-10-07 22:23:09.908 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:23:11 compute-0 nova_compute[192716]: 2025-10-07 22:23:11.250 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:23:14 compute-0 nova_compute[192716]: 2025-10-07 22:23:14.910 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:23:15 compute-0 podman[228707]: 2025-10-07 22:23:15.835358046 +0000 UTC m=+0.066534875 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, org.label-schema.build-date=20251007)
Oct 07 22:23:15 compute-0 podman[228706]: 2025-10-07 22:23:15.891137681 +0000 UTC m=+0.122055183 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007)
Oct 07 22:23:16 compute-0 nova_compute[192716]: 2025-10-07 22:23:16.252 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:23:19 compute-0 nova_compute[192716]: 2025-10-07 22:23:19.913 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:23:21 compute-0 nova_compute[192716]: 2025-10-07 22:23:21.253 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:23:21 compute-0 podman[228751]: 2025-10-07 22:23:21.848095247 +0000 UTC m=+0.077794394 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=edpm, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, version=9.6, container_name=openstack_network_exporter, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct 07 22:23:24 compute-0 nova_compute[192716]: 2025-10-07 22:23:24.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:23:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:23:25.667 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:23:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:23:25.668 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:23:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:23:25.668 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:23:26 compute-0 nova_compute[192716]: 2025-10-07 22:23:26.255 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:23:29 compute-0 podman[203153]: time="2025-10-07T22:23:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:23:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:23:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 22:23:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:23:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3032 "" "Go-http-client/1.1"
Oct 07 22:23:29 compute-0 nova_compute[192716]: 2025-10-07 22:23:29.952 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:23:31 compute-0 nova_compute[192716]: 2025-10-07 22:23:31.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:23:31 compute-0 openstack_network_exporter[205305]: ERROR   22:23:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:23:31 compute-0 openstack_network_exporter[205305]: ERROR   22:23:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:23:31 compute-0 openstack_network_exporter[205305]: ERROR   22:23:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:23:31 compute-0 openstack_network_exporter[205305]: ERROR   22:23:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:23:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:23:31 compute-0 openstack_network_exporter[205305]: ERROR   22:23:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:23:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:23:32 compute-0 sshd-session[228773]: Invalid user testuser from 103.115.24.11 port 60670
Oct 07 22:23:32 compute-0 sshd-session[228773]: pam_unix(sshd:auth): check pass; user unknown
Oct 07 22:23:32 compute-0 sshd-session[228773]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.115.24.11
Oct 07 22:23:34 compute-0 podman[228776]: 2025-10-07 22:23:34.869354078 +0000 UTC m=+0.095811117 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, config_id=iscsid, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible)
Oct 07 22:23:34 compute-0 podman[228777]: 2025-10-07 22:23:34.873983682 +0000 UTC m=+0.092072619 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007)
Oct 07 22:23:34 compute-0 nova_compute[192716]: 2025-10-07 22:23:34.954 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:23:35 compute-0 sshd-session[228773]: Failed password for invalid user testuser from 103.115.24.11 port 60670 ssh2
Oct 07 22:23:35 compute-0 sshd-session[228773]: Received disconnect from 103.115.24.11 port 60670:11: Bye Bye [preauth]
Oct 07 22:23:35 compute-0 sshd-session[228773]: Disconnected from invalid user testuser 103.115.24.11 port 60670 [preauth]
Oct 07 22:23:36 compute-0 nova_compute[192716]: 2025-10-07 22:23:36.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:23:37 compute-0 podman[228815]: 2025-10-07 22:23:37.855138691 +0000 UTC m=+0.086207399 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 07 22:23:39 compute-0 nova_compute[192716]: 2025-10-07 22:23:39.957 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:23:41 compute-0 nova_compute[192716]: 2025-10-07 22:23:41.265 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:23:41 compute-0 nova_compute[192716]: 2025-10-07 22:23:41.991 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:23:41 compute-0 nova_compute[192716]: 2025-10-07 22:23:41.991 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 07 22:23:44 compute-0 ovn_controller[94904]: 2025-10-07T22:23:44Z|00282|memory_trim|INFO|Detected inactivity (last active 30014 ms ago): trimming memory
Oct 07 22:23:44 compute-0 nova_compute[192716]: 2025-10-07 22:23:44.997 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:23:44 compute-0 nova_compute[192716]: 2025-10-07 22:23:44.998 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:23:46 compute-0 nova_compute[192716]: 2025-10-07 22:23:46.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:23:46 compute-0 podman[228840]: 2025-10-07 22:23:46.815973511 +0000 UTC m=+0.053066538 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 07 22:23:46 compute-0 podman[228839]: 2025-10-07 22:23:46.863064566 +0000 UTC m=+0.103467569 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Oct 07 22:23:50 compute-0 nova_compute[192716]: 2025-10-07 22:23:50.000 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:23:51 compute-0 nova_compute[192716]: 2025-10-07 22:23:51.270 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:23:52 compute-0 podman[228881]: 2025-10-07 22:23:52.869065899 +0000 UTC m=+0.100679148 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, maintainer=Red Hat, Inc.)
Oct 07 22:23:52 compute-0 nova_compute[192716]: 2025-10-07 22:23:52.986 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:23:52 compute-0 nova_compute[192716]: 2025-10-07 22:23:52.989 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:23:53 compute-0 nova_compute[192716]: 2025-10-07 22:23:53.510 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:23:53 compute-0 nova_compute[192716]: 2025-10-07 22:23:53.510 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:23:53 compute-0 nova_compute[192716]: 2025-10-07 22:23:53.511 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:23:53 compute-0 nova_compute[192716]: 2025-10-07 22:23:53.511 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 07 22:23:53 compute-0 nova_compute[192716]: 2025-10-07 22:23:53.712 2 WARNING nova.virt.libvirt.driver [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 22:23:53 compute-0 nova_compute[192716]: 2025-10-07 22:23:53.713 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:23:53 compute-0 nova_compute[192716]: 2025-10-07 22:23:53.736 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.023s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:23:53 compute-0 nova_compute[192716]: 2025-10-07 22:23:53.737 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5823MB free_disk=73.29898071289062GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 07 22:23:53 compute-0 nova_compute[192716]: 2025-10-07 22:23:53.737 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:23:53 compute-0 nova_compute[192716]: 2025-10-07 22:23:53.738 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:23:54 compute-0 nova_compute[192716]: 2025-10-07 22:23:54.791 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 07 22:23:54 compute-0 nova_compute[192716]: 2025-10-07 22:23:54.792 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:23:53 up  1:32,  0 user,  load average: 0.22, 0.18, 0.18\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 07 22:23:54 compute-0 nova_compute[192716]: 2025-10-07 22:23:54.815 2 DEBUG nova.compute.provider_tree [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 22:23:55 compute-0 nova_compute[192716]: 2025-10-07 22:23:55.003 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:23:55 compute-0 nova_compute[192716]: 2025-10-07 22:23:55.324 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 22:23:55 compute-0 nova_compute[192716]: 2025-10-07 22:23:55.835 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 07 22:23:55 compute-0 nova_compute[192716]: 2025-10-07 22:23:55.835 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.097s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:23:56 compute-0 nova_compute[192716]: 2025-10-07 22:23:56.272 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:23:58 compute-0 nova_compute[192716]: 2025-10-07 22:23:58.836 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:23:59 compute-0 podman[203153]: time="2025-10-07T22:23:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:23:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:23:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 22:23:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:23:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3032 "" "Go-http-client/1.1"
Oct 07 22:24:00 compute-0 nova_compute[192716]: 2025-10-07 22:24:00.007 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:24:00 compute-0 nova_compute[192716]: 2025-10-07 22:24:00.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:24:00 compute-0 nova_compute[192716]: 2025-10-07 22:24:00.991 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:24:01 compute-0 nova_compute[192716]: 2025-10-07 22:24:01.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:24:01 compute-0 openstack_network_exporter[205305]: ERROR   22:24:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:24:01 compute-0 openstack_network_exporter[205305]: ERROR   22:24:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:24:01 compute-0 openstack_network_exporter[205305]: ERROR   22:24:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:24:01 compute-0 openstack_network_exporter[205305]: ERROR   22:24:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:24:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:24:01 compute-0 openstack_network_exporter[205305]: ERROR   22:24:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:24:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:24:03 compute-0 nova_compute[192716]: 2025-10-07 22:24:03.986 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:24:04 compute-0 nova_compute[192716]: 2025-10-07 22:24:04.496 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:24:05 compute-0 nova_compute[192716]: 2025-10-07 22:24:05.009 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:24:05 compute-0 podman[228905]: 2025-10-07 22:24:05.840152667 +0000 UTC m=+0.071299626 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_id=multipathd, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true)
Oct 07 22:24:05 compute-0 podman[228904]: 2025-10-07 22:24:05.874150382 +0000 UTC m=+0.107989719 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, config_id=iscsid, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct 07 22:24:06 compute-0 nova_compute[192716]: 2025-10-07 22:24:06.277 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:24:08 compute-0 podman[228945]: 2025-10-07 22:24:08.833320405 +0000 UTC m=+0.067969151 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 07 22:24:09 compute-0 sshd-session[228969]: Accepted publickey for zuul from 192.168.122.10 port 59710 ssh2: ECDSA SHA256:OH28aSFFDwSUyue/q6XYLqn3ZIRCwAhLEh6x3UJ/Ca0
Oct 07 22:24:09 compute-0 systemd-logind[798]: New session 35 of user zuul.
Oct 07 22:24:09 compute-0 systemd[1]: Started Session 35 of User zuul.
Oct 07 22:24:09 compute-0 sshd-session[228969]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 07 22:24:09 compute-0 sudo[228973]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp -p container,openstack_edpm,system,storage,virt'
Oct 07 22:24:09 compute-0 sudo[228973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 22:24:10 compute-0 nova_compute[192716]: 2025-10-07 22:24:10.020 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:24:11 compute-0 nova_compute[192716]: 2025-10-07 22:24:11.279 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:24:14 compute-0 ovs-vsctl[229145]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Oct 07 22:24:14 compute-0 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 228997 (sos)
Oct 07 22:24:14 compute-0 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Oct 07 22:24:15 compute-0 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Oct 07 22:24:15 compute-0 nova_compute[192716]: 2025-10-07 22:24:15.023 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:24:15 compute-0 virtqemud[192532]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Oct 07 22:24:15 compute-0 virtqemud[192532]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Oct 07 22:24:15 compute-0 virtqemud[192532]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Oct 07 22:24:16 compute-0 kernel: block sr0: the capability attribute has been deprecated.
Oct 07 22:24:16 compute-0 nova_compute[192716]: 2025-10-07 22:24:16.281 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:24:16 compute-0 crontab[229569]: (root) LIST (root)
Oct 07 22:24:17 compute-0 podman[229647]: 2025-10-07 22:24:17.882561916 +0000 UTC m=+0.104912771 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, 
tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007)
Oct 07 22:24:17 compute-0 podman[229646]: 2025-10-07 22:24:17.920133155 +0000 UTC m=+0.138157194 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Oct 07 22:24:19 compute-0 systemd[1]: Starting Hostname Service...
Oct 07 22:24:19 compute-0 systemd[1]: Started Hostname Service.
Oct 07 22:24:20 compute-0 nova_compute[192716]: 2025-10-07 22:24:20.028 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:24:21 compute-0 nova_compute[192716]: 2025-10-07 22:24:21.283 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:24:23 compute-0 podman[230194]: 2025-10-07 22:24:23.867692915 +0000 UTC m=+0.104692835 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, vcs-type=git, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, container_name=openstack_network_exporter, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The 
Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.buildah.version=1.33.7)
Oct 07 22:24:25 compute-0 nova_compute[192716]: 2025-10-07 22:24:25.069 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:24:25 compute-0 ovs-appctl[230748]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Oct 07 22:24:25 compute-0 ovs-appctl[230759]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Oct 07 22:24:25 compute-0 ovs-appctl[230772]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Oct 07 22:24:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:24:25.672 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:24:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:24:25.673 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:24:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:24:25.673 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:24:26 compute-0 nova_compute[192716]: 2025-10-07 22:24:26.284 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:24:29 compute-0 podman[203153]: time="2025-10-07T22:24:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:24:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:24:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 22:24:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:24:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3028 "" "Go-http-client/1.1"
Oct 07 22:24:30 compute-0 nova_compute[192716]: 2025-10-07 22:24:30.071 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:24:31 compute-0 nova_compute[192716]: 2025-10-07 22:24:31.286 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:24:31 compute-0 openstack_network_exporter[205305]: ERROR   22:24:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:24:31 compute-0 openstack_network_exporter[205305]: ERROR   22:24:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:24:31 compute-0 openstack_network_exporter[205305]: ERROR   22:24:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:24:31 compute-0 openstack_network_exporter[205305]: ERROR   22:24:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:24:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:24:31 compute-0 openstack_network_exporter[205305]: ERROR   22:24:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:24:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:24:34 compute-0 virtqemud[192532]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Oct 07 22:24:35 compute-0 nova_compute[192716]: 2025-10-07 22:24:35.117 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:24:36 compute-0 podman[232135]: 2025-10-07 22:24:36.008074323 +0000 UTC m=+0.112595194 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251007, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 07 22:24:36 compute-0 podman[232139]: 2025-10-07 22:24:36.019967807 +0000 UTC m=+0.093787528 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=iscsid)
Oct 07 22:24:36 compute-0 nova_compute[192716]: 2025-10-07 22:24:36.290 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:24:36 compute-0 systemd[1]: Starting Time & Date Service...
Oct 07 22:24:36 compute-0 systemd[1]: Started Time & Date Service.
Oct 07 22:24:38 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Oct 07 22:24:39 compute-0 podman[232211]: 2025-10-07 22:24:39.869955821 +0000 UTC m=+0.100623966 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 07 22:24:40 compute-0 nova_compute[192716]: 2025-10-07 22:24:40.123 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:24:41 compute-0 nova_compute[192716]: 2025-10-07 22:24:41.292 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:24:41 compute-0 nova_compute[192716]: 2025-10-07 22:24:41.991 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:24:41 compute-0 nova_compute[192716]: 2025-10-07 22:24:41.992 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 07 22:24:44 compute-0 nova_compute[192716]: 2025-10-07 22:24:44.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:24:45 compute-0 nova_compute[192716]: 2025-10-07 22:24:45.127 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:24:46 compute-0 nova_compute[192716]: 2025-10-07 22:24:46.303 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:24:48 compute-0 podman[232236]: 2025-10-07 22:24:48.879392889 +0000 UTC m=+0.094005175 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct 07 22:24:48 compute-0 podman[232235]: 2025-10-07 22:24:48.921745296 +0000 UTC m=+0.136265629 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 07 22:24:50 compute-0 nova_compute[192716]: 2025-10-07 22:24:50.159 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:24:51 compute-0 nova_compute[192716]: 2025-10-07 22:24:51.357 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:24:52 compute-0 nova_compute[192716]: 2025-10-07 22:24:52.985 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:24:54 compute-0 podman[232278]: 2025-10-07 22:24:54.852912042 +0000 UTC m=+0.086826717 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=edpm, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, container_name=openstack_network_exporter, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., release=1755695350)
Oct 07 22:24:54 compute-0 nova_compute[192716]: 2025-10-07 22:24:54.994 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:24:55 compute-0 nova_compute[192716]: 2025-10-07 22:24:55.163 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:24:55 compute-0 nova_compute[192716]: 2025-10-07 22:24:55.669 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:24:55 compute-0 nova_compute[192716]: 2025-10-07 22:24:55.670 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:24:55 compute-0 nova_compute[192716]: 2025-10-07 22:24:55.670 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:24:55 compute-0 nova_compute[192716]: 2025-10-07 22:24:55.670 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 07 22:24:55 compute-0 nova_compute[192716]: 2025-10-07 22:24:55.804 2 WARNING nova.virt.libvirt.driver [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 22:24:55 compute-0 nova_compute[192716]: 2025-10-07 22:24:55.806 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:24:55 compute-0 nova_compute[192716]: 2025-10-07 22:24:55.827 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.021s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:24:55 compute-0 nova_compute[192716]: 2025-10-07 22:24:55.828 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5563MB free_disk=73.02627563476562GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 07 22:24:55 compute-0 nova_compute[192716]: 2025-10-07 22:24:55.828 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:24:55 compute-0 nova_compute[192716]: 2025-10-07 22:24:55.829 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:24:56 compute-0 nova_compute[192716]: 2025-10-07 22:24:56.361 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:24:56 compute-0 nova_compute[192716]: 2025-10-07 22:24:56.905 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 07 22:24:56 compute-0 nova_compute[192716]: 2025-10-07 22:24:56.906 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:24:55 up  1:33,  0 user,  load average: 0.95, 0.39, 0.25\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 07 22:24:57 compute-0 nova_compute[192716]: 2025-10-07 22:24:57.056 2 DEBUG nova.compute.provider_tree [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 22:24:57 compute-0 sudo[228973]: pam_unix(sudo:session): session closed for user root
Oct 07 22:24:57 compute-0 sshd-session[228972]: Received disconnect from 192.168.122.10 port 59710:11: disconnected by user
Oct 07 22:24:57 compute-0 nova_compute[192716]: 2025-10-07 22:24:57.567 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 22:24:57 compute-0 sshd-session[228972]: Disconnected from user zuul 192.168.122.10 port 59710
Oct 07 22:24:57 compute-0 sshd-session[228969]: pam_unix(sshd:session): session closed for user zuul
Oct 07 22:24:57 compute-0 systemd[1]: session-35.scope: Deactivated successfully.
Oct 07 22:24:57 compute-0 systemd-logind[798]: Session 35 logged out. Waiting for processes to exit.
Oct 07 22:24:57 compute-0 systemd[1]: session-35.scope: Consumed 1min 20.647s CPU time, 518.5M memory peak, read 110.1M from disk, written 28.1M to disk.
Oct 07 22:24:57 compute-0 systemd-logind[798]: Removed session 35.
Oct 07 22:24:57 compute-0 sshd-session[232302]: Accepted publickey for zuul from 192.168.122.10 port 54420 ssh2: ECDSA SHA256:OH28aSFFDwSUyue/q6XYLqn3ZIRCwAhLEh6x3UJ/Ca0
Oct 07 22:24:57 compute-0 systemd-logind[798]: New session 36 of user zuul.
Oct 07 22:24:57 compute-0 systemd[1]: Started Session 36 of User zuul.
Oct 07 22:24:57 compute-0 sshd-session[232302]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 07 22:24:57 compute-0 sudo[232306]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/cat /var/tmp/sos-osp/sosreport-compute-0-2025-10-07-guupfxe.tar.xz
Oct 07 22:24:57 compute-0 sudo[232306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 22:24:58 compute-0 sudo[232306]: pam_unix(sudo:session): session closed for user root
Oct 07 22:24:58 compute-0 sshd-session[232305]: Received disconnect from 192.168.122.10 port 54420:11: disconnected by user
Oct 07 22:24:58 compute-0 sshd-session[232305]: Disconnected from user zuul 192.168.122.10 port 54420
Oct 07 22:24:58 compute-0 sshd-session[232302]: pam_unix(sshd:session): session closed for user zuul
Oct 07 22:24:58 compute-0 systemd[1]: session-36.scope: Deactivated successfully.
Oct 07 22:24:58 compute-0 systemd-logind[798]: Session 36 logged out. Waiting for processes to exit.
Oct 07 22:24:58 compute-0 systemd-logind[798]: Removed session 36.
Oct 07 22:24:58 compute-0 nova_compute[192716]: 2025-10-07 22:24:58.080 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 07 22:24:58 compute-0 nova_compute[192716]: 2025-10-07 22:24:58.081 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.252s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:24:58 compute-0 sshd-session[232331]: Accepted publickey for zuul from 192.168.122.10 port 54432 ssh2: ECDSA SHA256:OH28aSFFDwSUyue/q6XYLqn3ZIRCwAhLEh6x3UJ/Ca0
Oct 07 22:24:58 compute-0 systemd-logind[798]: New session 37 of user zuul.
Oct 07 22:24:58 compute-0 systemd[1]: Started Session 37 of User zuul.
Oct 07 22:24:58 compute-0 sshd-session[232331]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 07 22:24:58 compute-0 sudo[232335]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rm -rf /var/tmp/sos-osp
Oct 07 22:24:58 compute-0 sudo[232335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 22:24:58 compute-0 sudo[232335]: pam_unix(sudo:session): session closed for user root
Oct 07 22:24:58 compute-0 sshd-session[232334]: Received disconnect from 192.168.122.10 port 54432:11: disconnected by user
Oct 07 22:24:58 compute-0 sshd-session[232334]: Disconnected from user zuul 192.168.122.10 port 54432
Oct 07 22:24:58 compute-0 sshd-session[232331]: pam_unix(sshd:session): session closed for user zuul
Oct 07 22:24:58 compute-0 systemd[1]: session-37.scope: Deactivated successfully.
Oct 07 22:24:58 compute-0 systemd-logind[798]: Session 37 logged out. Waiting for processes to exit.
Oct 07 22:24:58 compute-0 systemd-logind[798]: Removed session 37.
Oct 07 22:24:59 compute-0 nova_compute[192716]: 2025-10-07 22:24:59.077 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:24:59 compute-0 podman[203153]: time="2025-10-07T22:24:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:24:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:24:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 22:24:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:24:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3023 "" "Go-http-client/1.1"
Oct 07 22:25:00 compute-0 nova_compute[192716]: 2025-10-07 22:25:00.166 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:25:01 compute-0 nova_compute[192716]: 2025-10-07 22:25:01.361 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:25:01 compute-0 openstack_network_exporter[205305]: ERROR   22:25:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:25:01 compute-0 openstack_network_exporter[205305]: ERROR   22:25:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:25:01 compute-0 openstack_network_exporter[205305]: ERROR   22:25:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:25:01 compute-0 openstack_network_exporter[205305]: ERROR   22:25:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:25:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:25:01 compute-0 openstack_network_exporter[205305]: ERROR   22:25:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:25:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:25:02 compute-0 nova_compute[192716]: 2025-10-07 22:25:02.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:25:02 compute-0 nova_compute[192716]: 2025-10-07 22:25:02.991 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:25:05 compute-0 nova_compute[192716]: 2025-10-07 22:25:05.170 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:25:05 compute-0 nova_compute[192716]: 2025-10-07 22:25:05.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:25:06 compute-0 nova_compute[192716]: 2025-10-07 22:25:06.407 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:25:06 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Oct 07 22:25:06 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct 07 22:25:06 compute-0 podman[232361]: 2025-10-07 22:25:06.671934388 +0000 UTC m=+0.091574104 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true)
Oct 07 22:25:06 compute-0 podman[232360]: 2025-10-07 22:25:06.683200294 +0000 UTC m=+0.104566820 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true)
Oct 07 22:25:10 compute-0 nova_compute[192716]: 2025-10-07 22:25:10.172 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:25:10 compute-0 podman[232402]: 2025-10-07 22:25:10.831138751 +0000 UTC m=+0.066633571 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 07 22:25:11 compute-0 nova_compute[192716]: 2025-10-07 22:25:11.409 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:25:15 compute-0 nova_compute[192716]: 2025-10-07 22:25:15.179 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:25:16 compute-0 nova_compute[192716]: 2025-10-07 22:25:16.412 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:25:19 compute-0 podman[232428]: 2025-10-07 22:25:19.820977704 +0000 UTC m=+0.057453295 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true)
Oct 07 22:25:19 compute-0 podman[232427]: 2025-10-07 22:25:19.890031894 +0000 UTC m=+0.124735674 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true)
Oct 07 22:25:20 compute-0 nova_compute[192716]: 2025-10-07 22:25:20.181 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:25:21 compute-0 nova_compute[192716]: 2025-10-07 22:25:21.414 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:25:25 compute-0 nova_compute[192716]: 2025-10-07 22:25:25.183 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:25:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:25:25.674 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:25:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:25:25.675 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:25:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:25:25.675 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:25:25 compute-0 podman[232473]: 2025-10-07 22:25:25.830937052 +0000 UTC m=+0.067530827 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., release=1755695350, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, name=ubi9-minimal, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, architecture=x86_64)
Oct 07 22:25:26 compute-0 nova_compute[192716]: 2025-10-07 22:25:26.415 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:25:29 compute-0 podman[203153]: time="2025-10-07T22:25:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:25:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:25:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 22:25:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:25:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3027 "" "Go-http-client/1.1"
Oct 07 22:25:30 compute-0 nova_compute[192716]: 2025-10-07 22:25:30.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:25:31 compute-0 nova_compute[192716]: 2025-10-07 22:25:31.418 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:25:31 compute-0 openstack_network_exporter[205305]: ERROR   22:25:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:25:31 compute-0 openstack_network_exporter[205305]: ERROR   22:25:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:25:31 compute-0 openstack_network_exporter[205305]: ERROR   22:25:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:25:31 compute-0 openstack_network_exporter[205305]: ERROR   22:25:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:25:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:25:31 compute-0 openstack_network_exporter[205305]: ERROR   22:25:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:25:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:25:35 compute-0 nova_compute[192716]: 2025-10-07 22:25:35.225 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:25:36 compute-0 nova_compute[192716]: 2025-10-07 22:25:36.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:25:36 compute-0 podman[232495]: 2025-10-07 22:25:36.845013463 +0000 UTC m=+0.071077720 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct 07 22:25:36 compute-0 podman[232494]: 2025-10-07 22:25:36.870035808 +0000 UTC m=+0.096693753 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_build_tag=watcher_latest, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid)
Oct 07 22:25:40 compute-0 nova_compute[192716]: 2025-10-07 22:25:40.228 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:25:41 compute-0 nova_compute[192716]: 2025-10-07 22:25:41.423 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:25:41 compute-0 podman[232533]: 2025-10-07 22:25:41.844776031 +0000 UTC m=+0.078536276 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 07 22:25:43 compute-0 nova_compute[192716]: 2025-10-07 22:25:43.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:25:43 compute-0 nova_compute[192716]: 2025-10-07 22:25:43.991 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 07 22:25:45 compute-0 nova_compute[192716]: 2025-10-07 22:25:45.230 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:25:45 compute-0 nova_compute[192716]: 2025-10-07 22:25:45.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:25:46 compute-0 nova_compute[192716]: 2025-10-07 22:25:46.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:25:47 compute-0 unix_chkpwd[232559]: password check failed for user (root)
Oct 07 22:25:47 compute-0 sshd-session[232557]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.33  user=root
Oct 07 22:25:49 compute-0 sshd-session[232557]: Failed password for root from 193.46.255.33 port 49048 ssh2
Oct 07 22:25:49 compute-0 unix_chkpwd[232560]: password check failed for user (root)
Oct 07 22:25:50 compute-0 nova_compute[192716]: 2025-10-07 22:25:50.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:25:50 compute-0 podman[232562]: 2025-10-07 22:25:50.884730765 +0000 UTC m=+0.110553995 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Oct 07 22:25:50 compute-0 podman[232561]: 2025-10-07 22:25:50.894297982 +0000 UTC m=+0.135523708 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Oct 07 22:25:51 compute-0 sshd-session[232557]: Failed password for root from 193.46.255.33 port 49048 ssh2
Oct 07 22:25:51 compute-0 nova_compute[192716]: 2025-10-07 22:25:51.427 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:25:51 compute-0 unix_chkpwd[232606]: password check failed for user (root)
Oct 07 22:25:53 compute-0 sshd-session[232557]: Failed password for root from 193.46.255.33 port 49048 ssh2
Oct 07 22:25:53 compute-0 nova_compute[192716]: 2025-10-07 22:25:53.986 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:25:55 compute-0 nova_compute[192716]: 2025-10-07 22:25:55.234 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:25:55 compute-0 sshd-session[232557]: Received disconnect from 193.46.255.33 port 49048:11:  [preauth]
Oct 07 22:25:55 compute-0 sshd-session[232557]: Disconnected from authenticating user root 193.46.255.33 port 49048 [preauth]
Oct 07 22:25:55 compute-0 sshd-session[232557]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.33  user=root
Oct 07 22:25:55 compute-0 nova_compute[192716]: 2025-10-07 22:25:55.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:25:56 compute-0 nova_compute[192716]: 2025-10-07 22:25:56.429 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:25:56 compute-0 nova_compute[192716]: 2025-10-07 22:25:56.508 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:25:56 compute-0 nova_compute[192716]: 2025-10-07 22:25:56.509 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:25:56 compute-0 nova_compute[192716]: 2025-10-07 22:25:56.509 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:25:56 compute-0 nova_compute[192716]: 2025-10-07 22:25:56.509 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 07 22:25:56 compute-0 nova_compute[192716]: 2025-10-07 22:25:56.695 2 WARNING nova.virt.libvirt.driver [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 22:25:56 compute-0 nova_compute[192716]: 2025-10-07 22:25:56.696 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:25:56 compute-0 nova_compute[192716]: 2025-10-07 22:25:56.730 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.034s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:25:56 compute-0 nova_compute[192716]: 2025-10-07 22:25:56.731 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5787MB free_disk=73.29865264892578GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 07 22:25:56 compute-0 nova_compute[192716]: 2025-10-07 22:25:56.731 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:25:56 compute-0 nova_compute[192716]: 2025-10-07 22:25:56.732 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:25:56 compute-0 unix_chkpwd[232627]: password check failed for user (root)
Oct 07 22:25:56 compute-0 sshd-session[232607]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.33  user=root
Oct 07 22:25:56 compute-0 podman[232610]: 2025-10-07 22:25:56.838478464 +0000 UTC m=+0.067282411 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, architecture=x86_64, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, release=1755695350, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., vcs-type=git, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct 07 22:25:57 compute-0 nova_compute[192716]: 2025-10-07 22:25:57.924 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 07 22:25:57 compute-0 nova_compute[192716]: 2025-10-07 22:25:57.924 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:25:56 up  1:34,  0 user,  load average: 0.38, 0.33, 0.24\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 07 22:25:57 compute-0 nova_compute[192716]: 2025-10-07 22:25:57.949 2 DEBUG nova.compute.provider_tree [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 22:25:58 compute-0 sshd-session[232607]: Failed password for root from 193.46.255.33 port 45804 ssh2
Oct 07 22:25:58 compute-0 nova_compute[192716]: 2025-10-07 22:25:58.456 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 22:25:58 compute-0 nova_compute[192716]: 2025-10-07 22:25:58.967 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 07 22:25:58 compute-0 nova_compute[192716]: 2025-10-07 22:25:58.968 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.236s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:25:58 compute-0 unix_chkpwd[232632]: password check failed for user (root)
Oct 07 22:25:59 compute-0 podman[203153]: time="2025-10-07T22:25:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:25:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:25:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 22:25:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:25:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3031 "" "Go-http-client/1.1"
Oct 07 22:25:59 compute-0 nova_compute[192716]: 2025-10-07 22:25:59.968 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:26:00 compute-0 nova_compute[192716]: 2025-10-07 22:26:00.236 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:26:01 compute-0 openstack_network_exporter[205305]: ERROR   22:26:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:26:01 compute-0 openstack_network_exporter[205305]: ERROR   22:26:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:26:01 compute-0 openstack_network_exporter[205305]: ERROR   22:26:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:26:01 compute-0 openstack_network_exporter[205305]: ERROR   22:26:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:26:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:26:01 compute-0 openstack_network_exporter[205305]: ERROR   22:26:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:26:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:26:01 compute-0 nova_compute[192716]: 2025-10-07 22:26:01.431 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:26:01 compute-0 sshd-session[232607]: Failed password for root from 193.46.255.33 port 45804 ssh2
Oct 07 22:26:02 compute-0 nova_compute[192716]: 2025-10-07 22:26:02.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:26:03 compute-0 unix_chkpwd[232633]: password check failed for user (root)
Oct 07 22:26:03 compute-0 nova_compute[192716]: 2025-10-07 22:26:03.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:26:05 compute-0 nova_compute[192716]: 2025-10-07 22:26:05.238 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:26:05 compute-0 sshd-session[232607]: Failed password for root from 193.46.255.33 port 45804 ssh2
Oct 07 22:26:05 compute-0 nova_compute[192716]: 2025-10-07 22:26:05.991 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:26:06 compute-0 nova_compute[192716]: 2025-10-07 22:26:06.433 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:26:06 compute-0 nova_compute[192716]: 2025-10-07 22:26:06.986 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:26:07 compute-0 sshd-session[232607]: Received disconnect from 193.46.255.33 port 45804:11:  [preauth]
Oct 07 22:26:07 compute-0 sshd-session[232607]: Disconnected from authenticating user root 193.46.255.33 port 45804 [preauth]
Oct 07 22:26:07 compute-0 sshd-session[232607]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.33  user=root
Oct 07 22:26:07 compute-0 podman[232636]: 2025-10-07 22:26:07.822961018 +0000 UTC m=+0.059219076 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid)
Oct 07 22:26:07 compute-0 podman[232637]: 2025-10-07 22:26:07.828911241 +0000 UTC m=+0.061547484 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct 07 22:26:08 compute-0 unix_chkpwd[232674]: password check failed for user (root)
Oct 07 22:26:08 compute-0 sshd-session[232634]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.33  user=root
Oct 07 22:26:09 compute-0 sshd-session[232634]: Failed password for root from 193.46.255.33 port 14288 ssh2
Oct 07 22:26:10 compute-0 nova_compute[192716]: 2025-10-07 22:26:10.240 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:26:10 compute-0 unix_chkpwd[232675]: password check failed for user (root)
Oct 07 22:26:11 compute-0 nova_compute[192716]: 2025-10-07 22:26:11.436 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:26:12 compute-0 podman[232676]: 2025-10-07 22:26:12.805940169 +0000 UTC m=+0.049978448 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 07 22:26:13 compute-0 sshd-session[232634]: Failed password for root from 193.46.255.33 port 14288 ssh2
Oct 07 22:26:14 compute-0 unix_chkpwd[232700]: password check failed for user (root)
Oct 07 22:26:15 compute-0 nova_compute[192716]: 2025-10-07 22:26:15.242 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:26:16 compute-0 nova_compute[192716]: 2025-10-07 22:26:16.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:26:16 compute-0 sshd-session[232634]: Failed password for root from 193.46.255.33 port 14288 ssh2
Oct 07 22:26:18 compute-0 sshd-session[232634]: Received disconnect from 193.46.255.33 port 14288:11:  [preauth]
Oct 07 22:26:18 compute-0 sshd-session[232634]: Disconnected from authenticating user root 193.46.255.33 port 14288 [preauth]
Oct 07 22:26:18 compute-0 sshd-session[232634]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.33  user=root
Oct 07 22:26:20 compute-0 nova_compute[192716]: 2025-10-07 22:26:20.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:26:21 compute-0 nova_compute[192716]: 2025-10-07 22:26:21.440 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:26:21 compute-0 podman[232703]: 2025-10-07 22:26:21.845761798 +0000 UTC m=+0.076319022 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251007, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct 07 22:26:21 compute-0 podman[232702]: 2025-10-07 22:26:21.973989293 +0000 UTC m=+0.205996459 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Oct 07 22:26:25 compute-0 nova_compute[192716]: 2025-10-07 22:26:25.289 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:26:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:26:25.678 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:26:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:26:25.679 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:26:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:26:25.679 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:26:26 compute-0 nova_compute[192716]: 2025-10-07 22:26:26.441 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:26:27 compute-0 podman[232749]: 2025-10-07 22:26:27.833758672 +0000 UTC m=+0.066272651 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, name=ubi9-minimal, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., version=9.6)
Oct 07 22:26:29 compute-0 podman[203153]: time="2025-10-07T22:26:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:26:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:26:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 22:26:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:26:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3031 "" "Go-http-client/1.1"
Oct 07 22:26:30 compute-0 nova_compute[192716]: 2025-10-07 22:26:30.290 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:26:31 compute-0 openstack_network_exporter[205305]: ERROR   22:26:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:26:31 compute-0 openstack_network_exporter[205305]: ERROR   22:26:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:26:31 compute-0 openstack_network_exporter[205305]: ERROR   22:26:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:26:31 compute-0 openstack_network_exporter[205305]: ERROR   22:26:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:26:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:26:31 compute-0 openstack_network_exporter[205305]: ERROR   22:26:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:26:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:26:31 compute-0 nova_compute[192716]: 2025-10-07 22:26:31.469 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:26:35 compute-0 nova_compute[192716]: 2025-10-07 22:26:35.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:26:36 compute-0 nova_compute[192716]: 2025-10-07 22:26:36.471 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:26:38 compute-0 podman[232771]: 2025-10-07 22:26:38.860186102 +0000 UTC m=+0.087998810 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, tcib_managed=true)
Oct 07 22:26:38 compute-0 podman[232770]: 2025-10-07 22:26:38.86048377 +0000 UTC m=+0.094716604 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=iscsid, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4)
Oct 07 22:26:40 compute-0 nova_compute[192716]: 2025-10-07 22:26:40.332 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:26:41 compute-0 nova_compute[192716]: 2025-10-07 22:26:41.474 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:26:43 compute-0 podman[232808]: 2025-10-07 22:26:43.866557761 +0000 UTC m=+0.094731616 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 07 22:26:43 compute-0 nova_compute[192716]: 2025-10-07 22:26:43.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:26:43 compute-0 nova_compute[192716]: 2025-10-07 22:26:43.991 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 07 22:26:45 compute-0 nova_compute[192716]: 2025-10-07 22:26:45.336 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:26:45 compute-0 nova_compute[192716]: 2025-10-07 22:26:45.991 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:26:46 compute-0 nova_compute[192716]: 2025-10-07 22:26:46.475 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:26:48 compute-0 nova_compute[192716]: 2025-10-07 22:26:48.991 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:26:50 compute-0 nova_compute[192716]: 2025-10-07 22:26:50.371 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:26:51 compute-0 nova_compute[192716]: 2025-10-07 22:26:51.505 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:26:52 compute-0 podman[232833]: 2025-10-07 22:26:52.833827719 +0000 UTC m=+0.068541317 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251007, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 07 22:26:52 compute-0 podman[232832]: 2025-10-07 22:26:52.923570499 +0000 UTC m=+0.159795161 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2)
Oct 07 22:26:55 compute-0 nova_compute[192716]: 2025-10-07 22:26:55.374 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:26:56 compute-0 nova_compute[192716]: 2025-10-07 22:26:56.492 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:26:56 compute-0 nova_compute[192716]: 2025-10-07 22:26:56.506 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:26:56 compute-0 nova_compute[192716]: 2025-10-07 22:26:56.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:26:57 compute-0 nova_compute[192716]: 2025-10-07 22:26:57.513 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:26:57 compute-0 nova_compute[192716]: 2025-10-07 22:26:57.514 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:26:57 compute-0 nova_compute[192716]: 2025-10-07 22:26:57.514 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:26:57 compute-0 nova_compute[192716]: 2025-10-07 22:26:57.515 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 07 22:26:57 compute-0 nova_compute[192716]: 2025-10-07 22:26:57.655 2 WARNING nova.virt.libvirt.driver [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 22:26:57 compute-0 nova_compute[192716]: 2025-10-07 22:26:57.656 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:26:57 compute-0 nova_compute[192716]: 2025-10-07 22:26:57.690 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.034s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:26:57 compute-0 nova_compute[192716]: 2025-10-07 22:26:57.691 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5779MB free_disk=73.29864883422852GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 07 22:26:57 compute-0 nova_compute[192716]: 2025-10-07 22:26:57.691 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:26:57 compute-0 nova_compute[192716]: 2025-10-07 22:26:57.692 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:26:58 compute-0 nova_compute[192716]: 2025-10-07 22:26:58.759 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 07 22:26:58 compute-0 nova_compute[192716]: 2025-10-07 22:26:58.759 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:26:57 up  1:35,  0 user,  load average: 0.21, 0.28, 0.23\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 07 22:26:58 compute-0 nova_compute[192716]: 2025-10-07 22:26:58.777 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Refreshing inventories for resource provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Oct 07 22:26:58 compute-0 nova_compute[192716]: 2025-10-07 22:26:58.798 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Updating ProviderTree inventory for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Oct 07 22:26:58 compute-0 nova_compute[192716]: 2025-10-07 22:26:58.798 2 DEBUG nova.compute.provider_tree [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Updating inventory in ProviderTree for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Oct 07 22:26:58 compute-0 nova_compute[192716]: 2025-10-07 22:26:58.831 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Refreshing aggregate associations for resource provider 19d1aa8e-e3fb-43ab-9849-122569e48a32, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Oct 07 22:26:58 compute-0 podman[232879]: 2025-10-07 22:26:58.842076847 +0000 UTC m=+0.075912460 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, container_name=openstack_network_exporter, managed_by=edpm_ansible, vendor=Red Hat, Inc., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.openshift.tags=minimal rhel9, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, maintainer=Red Hat, Inc., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct 07 22:26:58 compute-0 nova_compute[192716]: 2025-10-07 22:26:58.866 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Refreshing trait associations for resource provider 19d1aa8e-e3fb-43ab-9849-122569e48a32, traits: COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_RESCUE_BFV,COMPUTE_ACCELERATORS,HW_CPU_X86_CLMUL,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NODE,HW_CPU_X86_AVX2,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSSE3,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_ARCH_X86_64,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_DEVICE_TAGGING,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_EXTEND,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_AVX,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE41,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SVM,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_BMI2,COMPUTE_SOUND_MODEL_SB16,COMPUTE_SOUND_MODEL_USB,COMPUTE_SOUND_MODEL_AC97,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_AESNI,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,HW_CPU_X86_F16C,HW_CPU_X86_MMX,COMPUTE_SECURITY_TPM_TIS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_CRB,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_ABM,HW_ARCH_X86_64,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_SECURITY_TPM_2_0,COMPUTE_
NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_STORAGE_BUS_SCSI _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Oct 07 22:26:58 compute-0 nova_compute[192716]: 2025-10-07 22:26:58.891 2 DEBUG nova.compute.provider_tree [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 22:26:59 compute-0 nova_compute[192716]: 2025-10-07 22:26:59.397 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 22:26:59 compute-0 podman[203153]: time="2025-10-07T22:26:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:26:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:26:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 22:26:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:26:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3026 "" "Go-http-client/1.1"
Oct 07 22:26:59 compute-0 nova_compute[192716]: 2025-10-07 22:26:59.906 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 07 22:26:59 compute-0 nova_compute[192716]: 2025-10-07 22:26:59.907 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.215s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:27:00 compute-0 nova_compute[192716]: 2025-10-07 22:27:00.376 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:27:00 compute-0 nova_compute[192716]: 2025-10-07 22:27:00.907 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:27:01 compute-0 openstack_network_exporter[205305]: ERROR   22:27:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:27:01 compute-0 openstack_network_exporter[205305]: ERROR   22:27:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:27:01 compute-0 openstack_network_exporter[205305]: ERROR   22:27:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:27:01 compute-0 openstack_network_exporter[205305]: ERROR   22:27:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:27:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:27:01 compute-0 openstack_network_exporter[205305]: ERROR   22:27:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:27:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:27:01 compute-0 nova_compute[192716]: 2025-10-07 22:27:01.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:27:03 compute-0 nova_compute[192716]: 2025-10-07 22:27:03.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:27:04 compute-0 nova_compute[192716]: 2025-10-07 22:27:04.991 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:27:05 compute-0 nova_compute[192716]: 2025-10-07 22:27:05.379 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:27:06 compute-0 nova_compute[192716]: 2025-10-07 22:27:06.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:27:06 compute-0 nova_compute[192716]: 2025-10-07 22:27:06.989 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:27:08 compute-0 nova_compute[192716]: 2025-10-07 22:27:08.989 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:27:08 compute-0 nova_compute[192716]: 2025-10-07 22:27:08.990 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Oct 07 22:27:09 compute-0 nova_compute[192716]: 2025-10-07 22:27:09.503 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Oct 07 22:27:09 compute-0 podman[232901]: 2025-10-07 22:27:09.841733953 +0000 UTC m=+0.080115043 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid)
Oct 07 22:27:09 compute-0 podman[232902]: 2025-10-07 22:27:09.851027882 +0000 UTC m=+0.075240061 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Oct 07 22:27:10 compute-0 nova_compute[192716]: 2025-10-07 22:27:10.409 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:27:11 compute-0 nova_compute[192716]: 2025-10-07 22:27:11.553 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:27:14 compute-0 podman[232940]: 2025-10-07 22:27:14.838402072 +0000 UTC m=+0.075182080 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 07 22:27:15 compute-0 nova_compute[192716]: 2025-10-07 22:27:15.411 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:27:16 compute-0 nova_compute[192716]: 2025-10-07 22:27:16.555 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:27:20 compute-0 nova_compute[192716]: 2025-10-07 22:27:20.413 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:27:20 compute-0 nova_compute[192716]: 2025-10-07 22:27:20.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:27:20 compute-0 nova_compute[192716]: 2025-10-07 22:27:20.990 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Oct 07 22:27:21 compute-0 nova_compute[192716]: 2025-10-07 22:27:21.556 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:27:22 compute-0 sshd-session[232966]: Invalid user jenkins from 103.115.24.11 port 45126
Oct 07 22:27:22 compute-0 sshd-session[232966]: pam_unix(sshd:auth): check pass; user unknown
Oct 07 22:27:22 compute-0 sshd-session[232966]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.115.24.11
Oct 07 22:27:23 compute-0 podman[232969]: 2025-10-07 22:27:23.056154332 +0000 UTC m=+0.052244045 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, org.label-schema.build-date=20251007, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true)
Oct 07 22:27:23 compute-0 podman[232968]: 2025-10-07 22:27:23.095004777 +0000 UTC m=+0.094522109 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Oct 07 22:27:25 compute-0 sshd-session[232966]: Failed password for invalid user jenkins from 103.115.24.11 port 45126 ssh2
Oct 07 22:27:25 compute-0 nova_compute[192716]: 2025-10-07 22:27:25.466 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:27:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:27:25.680 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:27:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:27:25.681 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:27:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:27:25.681 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:27:26 compute-0 nova_compute[192716]: 2025-10-07 22:27:26.599 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:27:27 compute-0 sshd-session[232966]: Received disconnect from 103.115.24.11 port 45126:11: Bye Bye [preauth]
Oct 07 22:27:27 compute-0 sshd-session[232966]: Disconnected from invalid user jenkins 103.115.24.11 port 45126 [preauth]
Oct 07 22:27:29 compute-0 podman[203153]: time="2025-10-07T22:27:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:27:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:27:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 22:27:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:27:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3029 "" "Go-http-client/1.1"
Oct 07 22:27:29 compute-0 podman[233015]: 2025-10-07 22:27:29.839565661 +0000 UTC m=+0.075106127 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, managed_by=edpm_ansible, name=ubi9-minimal, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.openshift.expose-services=, release=1755695350, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 07 22:27:30 compute-0 nova_compute[192716]: 2025-10-07 22:27:30.468 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:27:31 compute-0 openstack_network_exporter[205305]: ERROR   22:27:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:27:31 compute-0 openstack_network_exporter[205305]: ERROR   22:27:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:27:31 compute-0 openstack_network_exporter[205305]: ERROR   22:27:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:27:31 compute-0 openstack_network_exporter[205305]: ERROR   22:27:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:27:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:27:31 compute-0 openstack_network_exporter[205305]: ERROR   22:27:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:27:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:27:31 compute-0 nova_compute[192716]: 2025-10-07 22:27:31.600 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:27:35 compute-0 nova_compute[192716]: 2025-10-07 22:27:35.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:27:36 compute-0 nova_compute[192716]: 2025-10-07 22:27:36.601 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:27:40 compute-0 nova_compute[192716]: 2025-10-07 22:27:40.472 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:27:40 compute-0 podman[233038]: 2025-10-07 22:27:40.81972017 +0000 UTC m=+0.053972855 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Oct 07 22:27:40 compute-0 podman[233037]: 2025-10-07 22:27:40.822294984 +0000 UTC m=+0.060132173 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct 07 22:27:41 compute-0 nova_compute[192716]: 2025-10-07 22:27:41.602 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:27:44 compute-0 nova_compute[192716]: 2025-10-07 22:27:44.498 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:27:44 compute-0 nova_compute[192716]: 2025-10-07 22:27:44.498 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 07 22:27:45 compute-0 nova_compute[192716]: 2025-10-07 22:27:45.474 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:27:45 compute-0 podman[233076]: 2025-10-07 22:27:45.832938578 +0000 UTC m=+0.066755465 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 07 22:27:46 compute-0 nova_compute[192716]: 2025-10-07 22:27:46.604 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:27:47 compute-0 nova_compute[192716]: 2025-10-07 22:27:47.991 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:27:50 compute-0 nova_compute[192716]: 2025-10-07 22:27:50.476 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:27:51 compute-0 nova_compute[192716]: 2025-10-07 22:27:51.604 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:27:53 compute-0 podman[233101]: 2025-10-07 22:27:53.828803888 +0000 UTC m=+0.064143890 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4)
Oct 07 22:27:53 compute-0 podman[233100]: 2025-10-07 22:27:53.85682811 +0000 UTC m=+0.097750474 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Oct 07 22:27:55 compute-0 nova_compute[192716]: 2025-10-07 22:27:55.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:27:56 compute-0 nova_compute[192716]: 2025-10-07 22:27:56.607 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:27:56 compute-0 nova_compute[192716]: 2025-10-07 22:27:56.986 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:27:57 compute-0 nova_compute[192716]: 2025-10-07 22:27:57.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:27:58 compute-0 nova_compute[192716]: 2025-10-07 22:27:58.510 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:27:58 compute-0 nova_compute[192716]: 2025-10-07 22:27:58.511 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:27:58 compute-0 nova_compute[192716]: 2025-10-07 22:27:58.511 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:27:58 compute-0 nova_compute[192716]: 2025-10-07 22:27:58.511 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 07 22:27:58 compute-0 nova_compute[192716]: 2025-10-07 22:27:58.649 2 WARNING nova.virt.libvirt.driver [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 22:27:58 compute-0 nova_compute[192716]: 2025-10-07 22:27:58.650 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:27:58 compute-0 nova_compute[192716]: 2025-10-07 22:27:58.666 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.016s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:27:58 compute-0 nova_compute[192716]: 2025-10-07 22:27:58.667 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5816MB free_disk=73.29862976074219GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 07 22:27:58 compute-0 nova_compute[192716]: 2025-10-07 22:27:58.667 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:27:58 compute-0 nova_compute[192716]: 2025-10-07 22:27:58.668 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:27:59 compute-0 podman[203153]: time="2025-10-07T22:27:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:27:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:27:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 22:27:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:27:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3029 "" "Go-http-client/1.1"
Oct 07 22:27:59 compute-0 nova_compute[192716]: 2025-10-07 22:27:59.825 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 07 22:27:59 compute-0 nova_compute[192716]: 2025-10-07 22:27:59.826 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:27:58 up  1:36,  0 user,  load average: 0.07, 0.23, 0.21\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 07 22:27:59 compute-0 nova_compute[192716]: 2025-10-07 22:27:59.846 2 DEBUG nova.compute.provider_tree [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 22:28:00 compute-0 nova_compute[192716]: 2025-10-07 22:28:00.367 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 22:28:00 compute-0 nova_compute[192716]: 2025-10-07 22:28:00.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:28:00 compute-0 podman[233146]: 2025-10-07 22:28:00.826058713 +0000 UTC m=+0.064296863 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, container_name=openstack_network_exporter, name=ubi9-minimal, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7)
Oct 07 22:28:00 compute-0 nova_compute[192716]: 2025-10-07 22:28:00.920 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 07 22:28:00 compute-0 nova_compute[192716]: 2025-10-07 22:28:00.920 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.252s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:28:01 compute-0 openstack_network_exporter[205305]: ERROR   22:28:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:28:01 compute-0 openstack_network_exporter[205305]: ERROR   22:28:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:28:01 compute-0 openstack_network_exporter[205305]: ERROR   22:28:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:28:01 compute-0 openstack_network_exporter[205305]: ERROR   22:28:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:28:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:28:01 compute-0 openstack_network_exporter[205305]: ERROR   22:28:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:28:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:28:01 compute-0 nova_compute[192716]: 2025-10-07 22:28:01.610 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:28:02 compute-0 nova_compute[192716]: 2025-10-07 22:28:02.920 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:28:04 compute-0 nova_compute[192716]: 2025-10-07 22:28:04.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:28:04 compute-0 nova_compute[192716]: 2025-10-07 22:28:04.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:28:05 compute-0 nova_compute[192716]: 2025-10-07 22:28:05.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:28:06 compute-0 nova_compute[192716]: 2025-10-07 22:28:06.611 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:28:07 compute-0 nova_compute[192716]: 2025-10-07 22:28:07.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:28:10 compute-0 nova_compute[192716]: 2025-10-07 22:28:10.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:28:10 compute-0 nova_compute[192716]: 2025-10-07 22:28:10.985 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:28:11 compute-0 nova_compute[192716]: 2025-10-07 22:28:11.625 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:28:11 compute-0 podman[233167]: 2025-10-07 22:28:11.834599076 +0000 UTC m=+0.071404040 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, container_name=iscsid, org.label-schema.vendor=CentOS)
Oct 07 22:28:11 compute-0 podman[233168]: 2025-10-07 22:28:11.852556066 +0000 UTC m=+0.077719353 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0)
Oct 07 22:28:15 compute-0 nova_compute[192716]: 2025-10-07 22:28:15.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:28:16 compute-0 nova_compute[192716]: 2025-10-07 22:28:16.626 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:28:16 compute-0 podman[233208]: 2025-10-07 22:28:16.838799922 +0000 UTC m=+0.082732708 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 07 22:28:20 compute-0 nova_compute[192716]: 2025-10-07 22:28:20.591 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:28:21 compute-0 nova_compute[192716]: 2025-10-07 22:28:21.671 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:28:24 compute-0 podman[233234]: 2025-10-07 22:28:24.812996914 +0000 UTC m=+0.049697291 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251007)
Oct 07 22:28:24 compute-0 podman[233233]: 2025-10-07 22:28:24.851012595 +0000 UTC m=+0.087795355 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Oct 07 22:28:25 compute-0 nova_compute[192716]: 2025-10-07 22:28:25.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:28:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:28:25.682 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:28:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:28:25.682 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:28:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:28:25.683 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:28:26 compute-0 nova_compute[192716]: 2025-10-07 22:28:26.675 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:28:29 compute-0 podman[203153]: time="2025-10-07T22:28:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:28:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:28:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 22:28:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:28:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3028 "" "Go-http-client/1.1"
Oct 07 22:28:30 compute-0 nova_compute[192716]: 2025-10-07 22:28:30.596 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:28:31 compute-0 openstack_network_exporter[205305]: ERROR   22:28:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:28:31 compute-0 openstack_network_exporter[205305]: ERROR   22:28:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:28:31 compute-0 openstack_network_exporter[205305]: ERROR   22:28:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:28:31 compute-0 openstack_network_exporter[205305]: ERROR   22:28:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:28:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:28:31 compute-0 openstack_network_exporter[205305]: ERROR   22:28:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:28:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:28:31 compute-0 nova_compute[192716]: 2025-10-07 22:28:31.678 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:28:31 compute-0 podman[233280]: 2025-10-07 22:28:31.824985909 +0000 UTC m=+0.065392896 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, config_id=edpm, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vendor=Red Hat, Inc., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, name=ubi9-minimal, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers)
Oct 07 22:28:35 compute-0 nova_compute[192716]: 2025-10-07 22:28:35.598 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:28:36 compute-0 nova_compute[192716]: 2025-10-07 22:28:36.679 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:28:40 compute-0 nova_compute[192716]: 2025-10-07 22:28:40.602 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:28:41 compute-0 nova_compute[192716]: 2025-10-07 22:28:41.681 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:28:42 compute-0 podman[233302]: 2025-10-07 22:28:42.823401228 +0000 UTC m=+0.064465278 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct 07 22:28:42 compute-0 podman[233303]: 2025-10-07 22:28:42.823855561 +0000 UTC m=+0.063070248 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.build-date=20251007, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct 07 22:28:45 compute-0 nova_compute[192716]: 2025-10-07 22:28:45.605 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:28:45 compute-0 nova_compute[192716]: 2025-10-07 22:28:45.989 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:28:45 compute-0 nova_compute[192716]: 2025-10-07 22:28:45.990 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 07 22:28:46 compute-0 nova_compute[192716]: 2025-10-07 22:28:46.684 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:28:47 compute-0 podman[233341]: 2025-10-07 22:28:47.807537091 +0000 UTC m=+0.052484442 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 07 22:28:47 compute-0 nova_compute[192716]: 2025-10-07 22:28:47.991 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:28:50 compute-0 nova_compute[192716]: 2025-10-07 22:28:50.608 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:28:51 compute-0 nova_compute[192716]: 2025-10-07 22:28:51.686 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:28:55 compute-0 nova_compute[192716]: 2025-10-07 22:28:55.611 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:28:55 compute-0 podman[233366]: 2025-10-07 22:28:55.816411447 +0000 UTC m=+0.053950744 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.build-date=20251007, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct 07 22:28:55 compute-0 podman[233365]: 2025-10-07 22:28:55.838901569 +0000 UTC m=+0.079957118 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251007)
Oct 07 22:28:56 compute-0 nova_compute[192716]: 2025-10-07 22:28:56.689 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:28:58 compute-0 nova_compute[192716]: 2025-10-07 22:28:58.986 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:28:58 compute-0 nova_compute[192716]: 2025-10-07 22:28:58.989 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:28:59 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:28:59.034 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=34, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:e1:cc', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '62:84:ba:a4:1b:31'}, ipsec=False) old=SB_Global(nb_cfg=33) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 07 22:28:59 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:28:59.035 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 07 22:28:59 compute-0 nova_compute[192716]: 2025-10-07 22:28:59.102 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:28:59 compute-0 nova_compute[192716]: 2025-10-07 22:28:59.605 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:28:59 compute-0 nova_compute[192716]: 2025-10-07 22:28:59.605 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:28:59 compute-0 nova_compute[192716]: 2025-10-07 22:28:59.605 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:28:59 compute-0 nova_compute[192716]: 2025-10-07 22:28:59.605 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 07 22:28:59 compute-0 nova_compute[192716]: 2025-10-07 22:28:59.741 2 WARNING nova.virt.libvirt.driver [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 22:28:59 compute-0 nova_compute[192716]: 2025-10-07 22:28:59.742 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:28:59 compute-0 podman[203153]: time="2025-10-07T22:28:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:28:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:28:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 22:28:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:28:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3030 "" "Go-http-client/1.1"
Oct 07 22:28:59 compute-0 nova_compute[192716]: 2025-10-07 22:28:59.764 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.021s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:28:59 compute-0 nova_compute[192716]: 2025-10-07 22:28:59.765 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5824MB free_disk=73.29875183105469GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 07 22:28:59 compute-0 nova_compute[192716]: 2025-10-07 22:28:59.765 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:28:59 compute-0 nova_compute[192716]: 2025-10-07 22:28:59.765 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:29:00 compute-0 nova_compute[192716]: 2025-10-07 22:29:00.615 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:29:00 compute-0 nova_compute[192716]: 2025-10-07 22:29:00.824 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 07 22:29:00 compute-0 nova_compute[192716]: 2025-10-07 22:29:00.824 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:28:59 up  1:37,  0 user,  load average: 0.02, 0.18, 0.19\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 07 22:29:00 compute-0 nova_compute[192716]: 2025-10-07 22:29:00.870 2 DEBUG nova.compute.provider_tree [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 22:29:01 compute-0 nova_compute[192716]: 2025-10-07 22:29:01.405 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 22:29:01 compute-0 openstack_network_exporter[205305]: ERROR   22:29:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:29:01 compute-0 openstack_network_exporter[205305]: ERROR   22:29:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:29:01 compute-0 openstack_network_exporter[205305]: ERROR   22:29:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:29:01 compute-0 openstack_network_exporter[205305]: ERROR   22:29:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:29:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:29:01 compute-0 openstack_network_exporter[205305]: ERROR   22:29:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:29:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:29:01 compute-0 nova_compute[192716]: 2025-10-07 22:29:01.692 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:29:01 compute-0 nova_compute[192716]: 2025-10-07 22:29:01.920 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 07 22:29:01 compute-0 nova_compute[192716]: 2025-10-07 22:29:01.920 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.155s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:29:02 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:29:02.036 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=dca786dc-b408-4181-8e47-0e14c60f13da, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '34'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 07 22:29:02 compute-0 podman[233413]: 2025-10-07 22:29:02.804223229 +0000 UTC m=+0.048804035 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., version=9.6, architecture=x86_64)
Oct 07 22:29:04 compute-0 nova_compute[192716]: 2025-10-07 22:29:04.922 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:29:04 compute-0 nova_compute[192716]: 2025-10-07 22:29:04.991 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:29:05 compute-0 nova_compute[192716]: 2025-10-07 22:29:05.618 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:29:06 compute-0 nova_compute[192716]: 2025-10-07 22:29:06.694 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:29:06 compute-0 nova_compute[192716]: 2025-10-07 22:29:06.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:29:07 compute-0 nova_compute[192716]: 2025-10-07 22:29:07.991 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:29:10 compute-0 nova_compute[192716]: 2025-10-07 22:29:10.621 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:29:11 compute-0 nova_compute[192716]: 2025-10-07 22:29:11.695 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:29:13 compute-0 podman[233435]: 2025-10-07 22:29:13.825019208 +0000 UTC m=+0.058826305 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251007)
Oct 07 22:29:13 compute-0 podman[233436]: 2025-10-07 22:29:13.83197898 +0000 UTC m=+0.061693118 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true)
Oct 07 22:29:15 compute-0 nova_compute[192716]: 2025-10-07 22:29:15.623 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:29:16 compute-0 nova_compute[192716]: 2025-10-07 22:29:16.721 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:29:18 compute-0 podman[233475]: 2025-10-07 22:29:18.827815053 +0000 UTC m=+0.061326648 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 07 22:29:20 compute-0 nova_compute[192716]: 2025-10-07 22:29:20.625 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:29:21 compute-0 nova_compute[192716]: 2025-10-07 22:29:21.723 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:29:25 compute-0 nova_compute[192716]: 2025-10-07 22:29:25.628 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:29:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:29:25.684 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:29:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:29:25.684 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:29:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:29:25.685 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:29:26 compute-0 nova_compute[192716]: 2025-10-07 22:29:26.724 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:29:26 compute-0 podman[233501]: 2025-10-07 22:29:26.816189436 +0000 UTC m=+0.047536708 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 07 22:29:26 compute-0 podman[233500]: 2025-10-07 22:29:26.871935651 +0000 UTC m=+0.109524154 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 07 22:29:29 compute-0 podman[203153]: time="2025-10-07T22:29:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:29:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:29:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 22:29:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:29:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3028 "" "Go-http-client/1.1"
Oct 07 22:29:30 compute-0 nova_compute[192716]: 2025-10-07 22:29:30.630 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:29:31 compute-0 openstack_network_exporter[205305]: ERROR   22:29:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:29:31 compute-0 openstack_network_exporter[205305]: ERROR   22:29:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:29:31 compute-0 openstack_network_exporter[205305]: ERROR   22:29:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:29:31 compute-0 openstack_network_exporter[205305]: ERROR   22:29:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:29:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:29:31 compute-0 openstack_network_exporter[205305]: ERROR   22:29:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:29:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:29:31 compute-0 nova_compute[192716]: 2025-10-07 22:29:31.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:29:33 compute-0 podman[233544]: 2025-10-07 22:29:33.821274909 +0000 UTC m=+0.066186178 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, distribution-scope=public, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_id=edpm, vcs-type=git, maintainer=Red Hat, Inc., io.openshift.expose-services=, managed_by=edpm_ansible)
Oct 07 22:29:35 compute-0 nova_compute[192716]: 2025-10-07 22:29:35.633 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:29:36 compute-0 nova_compute[192716]: 2025-10-07 22:29:36.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:29:40 compute-0 nova_compute[192716]: 2025-10-07 22:29:40.636 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:29:41 compute-0 nova_compute[192716]: 2025-10-07 22:29:41.728 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:29:44 compute-0 podman[233566]: 2025-10-07 22:29:44.817820674 +0000 UTC m=+0.054003586 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct 07 22:29:44 compute-0 podman[233565]: 2025-10-07 22:29:44.83874477 +0000 UTC m=+0.079172755 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251007, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct 07 22:29:45 compute-0 nova_compute[192716]: 2025-10-07 22:29:45.638 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:29:46 compute-0 nova_compute[192716]: 2025-10-07 22:29:46.731 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:29:47 compute-0 nova_compute[192716]: 2025-10-07 22:29:47.991 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:29:47 compute-0 nova_compute[192716]: 2025-10-07 22:29:47.992 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:29:47 compute-0 nova_compute[192716]: 2025-10-07 22:29:47.992 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 07 22:29:49 compute-0 podman[233603]: 2025-10-07 22:29:49.850216945 +0000 UTC m=+0.091690218 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 07 22:29:50 compute-0 nova_compute[192716]: 2025-10-07 22:29:50.641 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:29:51 compute-0 nova_compute[192716]: 2025-10-07 22:29:51.733 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:29:55 compute-0 nova_compute[192716]: 2025-10-07 22:29:55.644 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:29:56 compute-0 nova_compute[192716]: 2025-10-07 22:29:56.734 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:29:57 compute-0 podman[233628]: 2025-10-07 22:29:57.813949514 +0000 UTC m=+0.050681970 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Oct 07 22:29:57 compute-0 podman[233627]: 2025-10-07 22:29:57.8738546 +0000 UTC m=+0.115227270 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest)
Oct 07 22:29:58 compute-0 nova_compute[192716]: 2025-10-07 22:29:58.986 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:29:58 compute-0 nova_compute[192716]: 2025-10-07 22:29:58.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:29:59 compute-0 nova_compute[192716]: 2025-10-07 22:29:59.577 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:29:59 compute-0 nova_compute[192716]: 2025-10-07 22:29:59.578 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:29:59 compute-0 nova_compute[192716]: 2025-10-07 22:29:59.578 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:29:59 compute-0 nova_compute[192716]: 2025-10-07 22:29:59.578 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 07 22:29:59 compute-0 nova_compute[192716]: 2025-10-07 22:29:59.719 2 WARNING nova.virt.libvirt.driver [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 22:29:59 compute-0 nova_compute[192716]: 2025-10-07 22:29:59.720 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:29:59 compute-0 nova_compute[192716]: 2025-10-07 22:29:59.740 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.020s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:29:59 compute-0 nova_compute[192716]: 2025-10-07 22:29:59.741 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5828MB free_disk=73.29877090454102GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 07 22:29:59 compute-0 nova_compute[192716]: 2025-10-07 22:29:59.741 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:29:59 compute-0 nova_compute[192716]: 2025-10-07 22:29:59.742 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:29:59 compute-0 podman[203153]: time="2025-10-07T22:29:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:29:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:29:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 22:29:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:29:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3025 "" "Go-http-client/1.1"
Oct 07 22:30:00 compute-0 nova_compute[192716]: 2025-10-07 22:30:00.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:30:00 compute-0 nova_compute[192716]: 2025-10-07 22:30:00.932 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 07 22:30:00 compute-0 nova_compute[192716]: 2025-10-07 22:30:00.933 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:29:59 up  1:38,  0 user,  load average: 0.01, 0.15, 0.18\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 07 22:30:00 compute-0 nova_compute[192716]: 2025-10-07 22:30:00.952 2 DEBUG nova.compute.provider_tree [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 22:30:01 compute-0 openstack_network_exporter[205305]: ERROR   22:30:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:30:01 compute-0 openstack_network_exporter[205305]: ERROR   22:30:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:30:01 compute-0 openstack_network_exporter[205305]: ERROR   22:30:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:30:01 compute-0 openstack_network_exporter[205305]: ERROR   22:30:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:30:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:30:01 compute-0 openstack_network_exporter[205305]: ERROR   22:30:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:30:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:30:01 compute-0 nova_compute[192716]: 2025-10-07 22:30:01.735 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:30:02 compute-0 nova_compute[192716]: 2025-10-07 22:30:02.471 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 22:30:03 compute-0 nova_compute[192716]: 2025-10-07 22:30:03.064 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 07 22:30:03 compute-0 nova_compute[192716]: 2025-10-07 22:30:03.065 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.323s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:30:04 compute-0 podman[233674]: 2025-10-07 22:30:04.82780726 +0000 UTC m=+0.072293246 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, name=ubi9-minimal, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, 
io.openshift.tags=minimal rhel9, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, distribution-scope=public, architecture=x86_64, config_id=edpm, managed_by=edpm_ansible, release=1755695350, container_name=openstack_network_exporter, version=9.6, maintainer=Red Hat, Inc., io.openshift.expose-services=)
Oct 07 22:30:05 compute-0 nova_compute[192716]: 2025-10-07 22:30:05.648 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:30:06 compute-0 nova_compute[192716]: 2025-10-07 22:30:06.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:30:07 compute-0 nova_compute[192716]: 2025-10-07 22:30:07.065 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:30:07 compute-0 nova_compute[192716]: 2025-10-07 22:30:07.066 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:30:07 compute-0 nova_compute[192716]: 2025-10-07 22:30:07.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:30:07 compute-0 nova_compute[192716]: 2025-10-07 22:30:07.991 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:30:10 compute-0 nova_compute[192716]: 2025-10-07 22:30:10.650 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:30:11 compute-0 nova_compute[192716]: 2025-10-07 22:30:11.739 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:30:12 compute-0 nova_compute[192716]: 2025-10-07 22:30:12.987 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:30:15 compute-0 nova_compute[192716]: 2025-10-07 22:30:15.652 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:30:15 compute-0 podman[233695]: 2025-10-07 22:30:15.814861659 +0000 UTC m=+0.056073385 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=iscsid, container_name=iscsid, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible)
Oct 07 22:30:15 compute-0 podman[233696]: 2025-10-07 22:30:15.81488943 +0000 UTC m=+0.053067918 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Oct 07 22:30:16 compute-0 nova_compute[192716]: 2025-10-07 22:30:16.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:30:20 compute-0 nova_compute[192716]: 2025-10-07 22:30:20.654 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:30:20 compute-0 podman[233731]: 2025-10-07 22:30:20.80401901 +0000 UTC m=+0.049885546 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 07 22:30:21 compute-0 nova_compute[192716]: 2025-10-07 22:30:21.745 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:30:25 compute-0 nova_compute[192716]: 2025-10-07 22:30:25.656 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:30:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:30:25.685 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:30:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:30:25.686 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:30:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:30:25.686 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:30:26 compute-0 nova_compute[192716]: 2025-10-07 22:30:26.747 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:30:28 compute-0 podman[233757]: 2025-10-07 22:30:28.86202477 +0000 UTC m=+0.100540614 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, 
org.label-schema.build-date=20251007, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Oct 07 22:30:28 compute-0 podman[233756]: 2025-10-07 22:30:28.92791583 +0000 UTC m=+0.167385791 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20251007)
Oct 07 22:30:29 compute-0 podman[203153]: time="2025-10-07T22:30:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:30:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:30:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 22:30:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:30:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3029 "" "Go-http-client/1.1"
Oct 07 22:30:30 compute-0 nova_compute[192716]: 2025-10-07 22:30:30.660 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:30:31 compute-0 openstack_network_exporter[205305]: ERROR   22:30:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:30:31 compute-0 openstack_network_exporter[205305]: ERROR   22:30:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:30:31 compute-0 openstack_network_exporter[205305]: ERROR   22:30:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:30:31 compute-0 openstack_network_exporter[205305]: ERROR   22:30:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:30:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:30:31 compute-0 openstack_network_exporter[205305]: ERROR   22:30:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:30:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:30:31 compute-0 nova_compute[192716]: 2025-10-07 22:30:31.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:30:35 compute-0 nova_compute[192716]: 2025-10-07 22:30:35.662 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:30:35 compute-0 podman[233800]: 2025-10-07 22:30:35.852005705 +0000 UTC m=+0.089971838 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, name=ubi9-minimal, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, vcs-type=git, 
com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=)
Oct 07 22:30:36 compute-0 nova_compute[192716]: 2025-10-07 22:30:36.751 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:30:40 compute-0 nova_compute[192716]: 2025-10-07 22:30:40.664 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:30:41 compute-0 nova_compute[192716]: 2025-10-07 22:30:41.753 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:30:45 compute-0 nova_compute[192716]: 2025-10-07 22:30:45.667 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:30:46 compute-0 nova_compute[192716]: 2025-10-07 22:30:46.754 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:30:46 compute-0 podman[233823]: 2025-10-07 22:30:46.817530389 +0000 UTC m=+0.054739017 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.build-date=20251007, config_id=multipathd, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Oct 07 22:30:46 compute-0 podman[233822]: 2025-10-07 22:30:46.841815813 +0000 UTC m=+0.083668585 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, container_name=iscsid, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 07 22:30:48 compute-0 nova_compute[192716]: 2025-10-07 22:30:48.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:30:49 compute-0 nova_compute[192716]: 2025-10-07 22:30:49.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:30:49 compute-0 nova_compute[192716]: 2025-10-07 22:30:49.991 2 DEBUG nova.compute.manager [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 07 22:30:50 compute-0 nova_compute[192716]: 2025-10-07 22:30:50.669 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:30:51 compute-0 nova_compute[192716]: 2025-10-07 22:30:51.756 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:30:51 compute-0 podman[233860]: 2025-10-07 22:30:51.814998601 +0000 UTC m=+0.055876420 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 07 22:30:55 compute-0 nova_compute[192716]: 2025-10-07 22:30:55.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:30:56 compute-0 nova_compute[192716]: 2025-10-07 22:30:56.800 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:30:59 compute-0 podman[203153]: time="2025-10-07T22:30:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:30:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:30:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 22:30:59 compute-0 podman[203153]: @ - - [07/Oct/2025:22:30:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3030 "" "Go-http-client/1.1"
Oct 07 22:30:59 compute-0 podman[233886]: 2025-10-07 22:30:59.814467248 +0000 UTC m=+0.046901930 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251007, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, 
config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.4)
Oct 07 22:30:59 compute-0 podman[233885]: 2025-10-07 22:30:59.847779703 +0000 UTC m=+0.087770603 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 07 22:30:59 compute-0 nova_compute[192716]: 2025-10-07 22:30:59.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:31:00 compute-0 nova_compute[192716]: 2025-10-07 22:31:00.510 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:31:00 compute-0 nova_compute[192716]: 2025-10-07 22:31:00.510 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:31:00 compute-0 nova_compute[192716]: 2025-10-07 22:31:00.511 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:31:00 compute-0 nova_compute[192716]: 2025-10-07 22:31:00.511 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 07 22:31:00 compute-0 nova_compute[192716]: 2025-10-07 22:31:00.651 2 WARNING nova.virt.libvirt.driver [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 07 22:31:00 compute-0 nova_compute[192716]: 2025-10-07 22:31:00.652 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 07 22:31:00 compute-0 nova_compute[192716]: 2025-10-07 22:31:00.672 2 DEBUG oslo_concurrency.processutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.020s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 07 22:31:00 compute-0 nova_compute[192716]: 2025-10-07 22:31:00.673 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5831MB free_disk=73.29875183105469GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 07 22:31:00 compute-0 nova_compute[192716]: 2025-10-07 22:31:00.673 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:31:00 compute-0 nova_compute[192716]: 2025-10-07 22:31:00.673 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:31:00 compute-0 nova_compute[192716]: 2025-10-07 22:31:00.718 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:31:01 compute-0 openstack_network_exporter[205305]: ERROR   22:31:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:31:01 compute-0 openstack_network_exporter[205305]: ERROR   22:31:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:31:01 compute-0 openstack_network_exporter[205305]: ERROR   22:31:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:31:01 compute-0 openstack_network_exporter[205305]: ERROR   22:31:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:31:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:31:01 compute-0 openstack_network_exporter[205305]: ERROR   22:31:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:31:01 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:31:01 compute-0 anacron[4762]: Job `cron.monthly' started
Oct 07 22:31:01 compute-0 anacron[4762]: Job `cron.monthly' terminated
Oct 07 22:31:01 compute-0 anacron[4762]: Normal exit (3 jobs run)
Oct 07 22:31:01 compute-0 nova_compute[192716]: 2025-10-07 22:31:01.802 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:31:02 compute-0 nova_compute[192716]: 2025-10-07 22:31:02.423 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 07 22:31:02 compute-0 nova_compute[192716]: 2025-10-07 22:31:02.423 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:31:00 up  1:39,  0 user,  load average: 0.04, 0.13, 0.17\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 07 22:31:02 compute-0 nova_compute[192716]: 2025-10-07 22:31:02.442 2 DEBUG nova.compute.provider_tree [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed in ProviderTree for provider: 19d1aa8e-e3fb-43ab-9849-122569e48a32 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 07 22:31:03 compute-0 nova_compute[192716]: 2025-10-07 22:31:03.089 2 DEBUG nova.scheduler.client.report [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Inventory has not changed for provider 19d1aa8e-e3fb-43ab-9849-122569e48a32 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 07 22:31:03 compute-0 nova_compute[192716]: 2025-10-07 22:31:03.708 2 DEBUG nova.compute.resource_tracker [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 07 22:31:03 compute-0 nova_compute[192716]: 2025-10-07 22:31:03.708 2 DEBUG oslo_concurrency.lockutils [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.035s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:31:04 compute-0 nova_compute[192716]: 2025-10-07 22:31:04.704 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:31:04 compute-0 nova_compute[192716]: 2025-10-07 22:31:04.705 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:31:05 compute-0 nova_compute[192716]: 2025-10-07 22:31:05.720 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:31:06 compute-0 nova_compute[192716]: 2025-10-07 22:31:06.803 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:31:06 compute-0 podman[233932]: 2025-10-07 22:31:06.81996564 +0000 UTC m=+0.063429169 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, release=1755695350, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6)
Oct 07 22:31:07 compute-0 nova_compute[192716]: 2025-10-07 22:31:07.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:31:09 compute-0 nova_compute[192716]: 2025-10-07 22:31:09.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:31:09 compute-0 nova_compute[192716]: 2025-10-07 22:31:09.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:31:10 compute-0 nova_compute[192716]: 2025-10-07 22:31:10.723 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:31:11 compute-0 nova_compute[192716]: 2025-10-07 22:31:11.804 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:31:15 compute-0 nova_compute[192716]: 2025-10-07 22:31:15.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:31:16 compute-0 nova_compute[192716]: 2025-10-07 22:31:16.807 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:31:17 compute-0 podman[233951]: 2025-10-07 22:31:17.816193046 +0000 UTC m=+0.057166128 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct 07 22:31:17 compute-0 podman[233952]: 2025-10-07 22:31:17.828198124 +0000 UTC m=+0.063470310 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20251007, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Oct 07 22:31:20 compute-0 nova_compute[192716]: 2025-10-07 22:31:20.772 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:31:21 compute-0 nova_compute[192716]: 2025-10-07 22:31:21.848 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:31:22 compute-0 podman[233992]: 2025-10-07 22:31:22.821838343 +0000 UTC m=+0.066442055 container health_status 9abf4f238fdf41b34fffc366b376f61eec35918398c709ec048690d3c4f55feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 07 22:31:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:31:25.687 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 07 22:31:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:31:25.687 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 07 22:31:25 compute-0 ovn_metadata_agent[103786]: 2025-10-07 22:31:25.688 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 07 22:31:25 compute-0 nova_compute[192716]: 2025-10-07 22:31:25.774 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:31:26 compute-0 nova_compute[192716]: 2025-10-07 22:31:26.851 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:31:29 compute-0 podman[203153]: time="2025-10-07T22:31:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 07 22:31:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:31:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19528 "" "Go-http-client/1.1"
Oct 07 22:31:29 compute-0 podman[203153]: @ - - [07/Oct/2025:22:31:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3025 "" "Go-http-client/1.1"
Oct 07 22:31:30 compute-0 nova_compute[192716]: 2025-10-07 22:31:30.777 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:31:30 compute-0 podman[234018]: 2025-10-07 22:31:30.815705996 +0000 UTC m=+0.057200879 container health_status c22fac8db3ed0b541602b5de5b41ced62f94317a63b634499d053f13ea401675 (image=38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2)
Oct 07 22:31:30 compute-0 podman[234017]: 2025-10-07 22:31:30.855604682 +0000 UTC m=+0.101052609 container health_status 0a8207859936d4027a417a4efb998dc29310170e02f11cd0b3e964f59f7fb2ec (image=38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251007, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ovn_controller, io.buildah.version=1.41.4, config_id=ovn_controller)
Oct 07 22:31:31 compute-0 openstack_network_exporter[205305]: ERROR   22:31:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 07 22:31:31 compute-0 openstack_network_exporter[205305]: ERROR   22:31:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:31:31 compute-0 openstack_network_exporter[205305]: ERROR   22:31:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 07 22:31:31 compute-0 openstack_network_exporter[205305]: ERROR   22:31:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 07 22:31:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:31:31 compute-0 openstack_network_exporter[205305]: ERROR   22:31:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 07 22:31:31 compute-0 openstack_network_exporter[205305]: 
Oct 07 22:31:31 compute-0 nova_compute[192716]: 2025-10-07 22:31:31.851 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:31:33 compute-0 sshd-session[234063]: Invalid user odoo15 from 103.115.24.11 port 40296
Oct 07 22:31:33 compute-0 sshd-session[234063]: pam_unix(sshd:auth): check pass; user unknown
Oct 07 22:31:33 compute-0 sshd-session[234063]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.115.24.11
Oct 07 22:31:35 compute-0 sshd-session[234063]: Failed password for invalid user odoo15 from 103.115.24.11 port 40296 ssh2
Oct 07 22:31:35 compute-0 nova_compute[192716]: 2025-10-07 22:31:35.781 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:31:36 compute-0 nova_compute[192716]: 2025-10-07 22:31:36.853 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:31:37 compute-0 podman[234065]: 2025-10-07 22:31:37.815924258 +0000 UTC m=+0.056434235 container health_status c9f4fd55cb3e4ea80e36f7817bd0330c6b5d511f9afb683bf862a8c5afb4e3c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, release=1755695350, vcs-type=git, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, version=9.6, config_id=edpm, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Oct 07 22:31:37 compute-0 sshd-session[234086]: Accepted publickey for zuul from 192.168.122.10 port 39078 ssh2: ECDSA SHA256:OH28aSFFDwSUyue/q6XYLqn3ZIRCwAhLEh6x3UJ/Ca0
Oct 07 22:31:37 compute-0 systemd-logind[798]: New session 38 of user zuul.
Oct 07 22:31:37 compute-0 systemd[1]: Started Session 38 of User zuul.
Oct 07 22:31:37 compute-0 sshd-session[234086]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 07 22:31:38 compute-0 sudo[234090]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp -p container,openstack_edpm,system,storage,virt'
Oct 07 22:31:38 compute-0 sudo[234090]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 07 22:31:38 compute-0 sshd-session[234063]: Received disconnect from 103.115.24.11 port 40296:11: Bye Bye [preauth]
Oct 07 22:31:38 compute-0 sshd-session[234063]: Disconnected from invalid user odoo15 103.115.24.11 port 40296 [preauth]
Oct 07 22:31:40 compute-0 nova_compute[192716]: 2025-10-07 22:31:40.783 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:31:41 compute-0 nova_compute[192716]: 2025-10-07 22:31:41.855 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:31:42 compute-0 ovs-vsctl[234263]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Oct 07 22:31:43 compute-0 virtqemud[192532]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Oct 07 22:31:43 compute-0 virtqemud[192532]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Oct 07 22:31:43 compute-0 virtqemud[192532]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Oct 07 22:31:44 compute-0 crontab[234672]: (root) LIST (root)
Oct 07 22:31:45 compute-0 nova_compute[192716]: 2025-10-07 22:31:45.787 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:31:46 compute-0 nova_compute[192716]: 2025-10-07 22:31:46.857 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 07 22:31:47 compute-0 systemd[1]: Starting Hostname Service...
Oct 07 22:31:47 compute-0 systemd[1]: Started Hostname Service.
Oct 07 22:31:48 compute-0 podman[234864]: 2025-10-07 22:31:48.478295639 +0000 UTC m=+0.068804525 container health_status c57987061cd71f0313dfdead97ee745864d5a351363710b73078ee3e7bf118a2 (image=38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20251007, config_id=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 07 22:31:48 compute-0 podman[234862]: 2025-10-07 22:31:48.481383649 +0000 UTC m=+0.071144763 container health_status bb2b2c57388697b3f98f2637e78f88faa32747776ede5fd3da655662ab921f71 (image=38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=iscsid, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251007, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.12:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, container_name=iscsid)
Oct 07 22:31:48 compute-0 nova_compute[192716]: 2025-10-07 22:31:48.990 2 DEBUG oslo_service.periodic_task [None req-7578c25e-9225-428c-81e3-a4db0fd2d94b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 07 22:31:50 compute-0 nova_compute[192716]: 2025-10-07 22:31:50.791 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
